Nov 24 06:53:38.712150 kernel: Linux version 6.12.58-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sun Nov 23 20:49:05 -00 2025
Nov 24 06:53:38.712166 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14
Nov 24 06:53:38.712172 kernel: Disabled fast string operations
Nov 24 06:53:38.712176 kernel: BIOS-provided physical RAM map:
Nov 24 06:53:38.712180 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Nov 24 06:53:38.712184 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Nov 24 06:53:38.712189 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Nov 24 06:53:38.712194 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Nov 24 06:53:38.712198 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Nov 24 06:53:38.712202 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Nov 24 06:53:38.712207 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Nov 24 06:53:38.712211 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Nov 24 06:53:38.712215 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Nov 24 06:53:38.712219 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Nov 24 06:53:38.712225 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Nov 24 06:53:38.712230 kernel: NX (Execute Disable) protection: active
Nov 24 06:53:38.712234 kernel: APIC: Static calls initialized
Nov 24 06:53:38.712239 kernel: SMBIOS 2.7 present.
Nov 24 06:53:38.712244 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Nov 24 06:53:38.712249 kernel: DMI: Memory slots populated: 1/128
Nov 24 06:53:38.712253 kernel: vmware: hypercall mode: 0x00
Nov 24 06:53:38.712258 kernel: Hypervisor detected: VMware
Nov 24 06:53:38.712263 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Nov 24 06:53:38.712268 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Nov 24 06:53:38.712273 kernel: vmware: using clock offset of 3241391531 ns
Nov 24 06:53:38.712278 kernel: tsc: Detected 3408.000 MHz processor
Nov 24 06:53:38.712283 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 24 06:53:38.712288 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 24 06:53:38.712293 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Nov 24 06:53:38.712298 kernel: total RAM covered: 3072M
Nov 24 06:53:38.712303 kernel: Found optimal setting for mtrr clean up
Nov 24 06:53:38.712308 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Nov 24 06:53:38.712313 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Nov 24 06:53:38.712319 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Nov 24 06:53:38.712324 kernel: Using GB pages for direct mapping
Nov 24 06:53:38.712329 kernel: ACPI: Early table checksum verification disabled
Nov 24 06:53:38.712333 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Nov 24 06:53:38.712338 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Nov 24 06:53:38.712343 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Nov 24 06:53:38.712348 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Nov 24 06:53:38.712355 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Nov 24 06:53:38.712361 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Nov 24 06:53:38.712366 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Nov 24 06:53:38.712371 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Nov 24 06:53:38.712376 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Nov 24 06:53:38.712381 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Nov 24 06:53:38.712386 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Nov 24 06:53:38.712392 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Nov 24 06:53:38.712397 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Nov 24 06:53:38.712402 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Nov 24 06:53:38.712407 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Nov 24 06:53:38.712412 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Nov 24 06:53:38.712417 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Nov 24 06:53:38.712422 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Nov 24 06:53:38.712427 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Nov 24 06:53:38.712431 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Nov 24 06:53:38.712437 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Nov 24 06:53:38.712442 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Nov 24 06:53:38.712447 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Nov 24 06:53:38.712452 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Nov 24 06:53:38.712457 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Nov 24 06:53:38.712462 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Nov 24 06:53:38.712467 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Nov 24 06:53:38.712472 kernel: Zone ranges:
Nov 24 06:53:38.712477 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Nov 24 06:53:38.712482 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Nov 24 06:53:38.712488 kernel: Normal empty
Nov 24 06:53:38.712493 kernel: Device empty
Nov 24 06:53:38.712498 kernel: Movable zone start for each node
Nov 24 06:53:38.712503 kernel: Early memory node ranges
Nov 24 06:53:38.712508 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Nov 24 06:53:38.712513 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Nov 24 06:53:38.712518 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Nov 24 06:53:38.712522 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Nov 24 06:53:38.712527 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 24 06:53:38.712533 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Nov 24 06:53:38.712538 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Nov 24 06:53:38.712543 kernel: ACPI: PM-Timer IO Port: 0x1008
Nov 24 06:53:38.712548 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Nov 24 06:53:38.712553 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Nov 24 06:53:38.712558 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Nov 24 06:53:38.712563 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Nov 24 06:53:38.712568 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Nov 24 06:53:38.712573 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Nov 24 06:53:38.712578 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Nov 24 06:53:38.712584 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Nov 24 06:53:38.712589 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Nov 24 06:53:38.712593 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Nov 24 06:53:38.712598 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Nov 24 06:53:38.712603 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Nov 24 06:53:38.712608 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Nov 24 06:53:38.712613 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Nov 24 06:53:38.712618 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Nov 24 06:53:38.712623 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Nov 24 06:53:38.712627 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Nov 24 06:53:38.712633 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Nov 24 06:53:38.712638 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Nov 24 06:53:38.712643 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Nov 24 06:53:38.712648 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Nov 24 06:53:38.712653 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Nov 24 06:53:38.712693 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Nov 24 06:53:38.712698 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Nov 24 06:53:38.712703 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Nov 24 06:53:38.712708 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Nov 24 06:53:38.712715 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Nov 24 06:53:38.712720 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Nov 24 06:53:38.712725 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Nov 24 06:53:38.712730 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Nov 24 06:53:38.712735 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Nov 24 06:53:38.712740 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Nov 24 06:53:38.712745 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Nov 24 06:53:38.712750 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Nov 24 06:53:38.712755 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Nov 24 06:53:38.712759 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Nov 24 06:53:38.712766 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Nov 24 06:53:38.712771 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Nov 24 06:53:38.712775 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Nov 24 06:53:38.712780 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Nov 24 06:53:38.712789 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Nov 24 06:53:38.712795 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Nov 24 06:53:38.712800 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Nov 24 06:53:38.712806 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Nov 24 06:53:38.712811 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Nov 24 06:53:38.712817 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Nov 24 06:53:38.712822 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Nov 24 06:53:38.712827 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Nov 24 06:53:38.712833 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Nov 24 06:53:38.712838 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Nov 24 06:53:38.712843 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Nov 24 06:53:38.712848 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Nov 24 06:53:38.712853 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Nov 24 06:53:38.712859 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Nov 24 06:53:38.712864 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Nov 24 06:53:38.712870 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Nov 24 06:53:38.712875 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Nov 24 06:53:38.712880 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Nov 24 06:53:38.712886 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Nov 24 06:53:38.712891 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Nov 24 06:53:38.712896 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Nov 24 06:53:38.712901 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Nov 24 06:53:38.712906 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Nov 24 06:53:38.712911 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Nov 24 06:53:38.712918 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Nov 24 06:53:38.712923 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Nov 24 06:53:38.712928 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Nov 24 06:53:38.712933 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Nov 24 06:53:38.712939 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Nov 24 06:53:38.712944 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Nov 24 06:53:38.712949 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Nov 24 06:53:38.712954 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Nov 24 06:53:38.712959 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Nov 24 06:53:38.712964 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Nov 24 06:53:38.712971 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Nov 24 06:53:38.712976 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Nov 24 06:53:38.712981 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Nov 24 06:53:38.712986 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Nov 24 06:53:38.712991 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Nov 24 06:53:38.712997 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Nov 24 06:53:38.713002 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Nov 24 06:53:38.713007 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Nov 24 06:53:38.713012 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Nov 24 06:53:38.713017 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Nov 24 06:53:38.713023 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Nov 24 06:53:38.713029 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Nov 24 06:53:38.713034 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Nov 24 06:53:38.713039 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Nov 24 06:53:38.713044 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Nov 24 06:53:38.713049 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Nov 24 06:53:38.713054 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Nov 24 06:53:38.713060 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Nov 24 06:53:38.713065 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Nov 24 06:53:38.713070 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Nov 24 06:53:38.713076 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Nov 24 06:53:38.713081 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Nov 24 06:53:38.713087 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Nov 24 06:53:38.713092 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Nov 24 06:53:38.713097 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Nov 24 06:53:38.713102 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Nov 24 06:53:38.713107 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Nov 24 06:53:38.713112 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Nov 24 06:53:38.713117 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Nov 24 06:53:38.713123 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Nov 24 06:53:38.713129 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Nov 24 06:53:38.713134 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Nov 24 06:53:38.713139 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Nov 24 06:53:38.713144 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Nov 24 06:53:38.713150 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Nov 24 06:53:38.713155 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Nov 24 06:53:38.713160 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Nov 24 06:53:38.713165 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Nov 24 06:53:38.713170 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Nov 24 06:53:38.713176 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Nov 24 06:53:38.713182 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Nov 24 06:53:38.713187 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Nov 24 06:53:38.713192 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Nov 24 06:53:38.713197 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Nov 24 06:53:38.713202 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Nov 24 06:53:38.713207 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Nov 24 06:53:38.713213 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Nov 24 06:53:38.713218 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Nov 24 06:53:38.713223 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Nov 24 06:53:38.713228 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Nov 24 06:53:38.713234 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Nov 24 06:53:38.713240 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Nov 24 06:53:38.713245 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Nov 24 06:53:38.713250 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Nov 24 06:53:38.713255 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Nov 24 06:53:38.713260 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Nov 24 06:53:38.713266 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 24 06:53:38.713271 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Nov 24 06:53:38.713277 kernel: TSC deadline timer available
Nov 24 06:53:38.713283 kernel: CPU topo: Max. logical packages: 128
Nov 24 06:53:38.713289 kernel: CPU topo: Max. logical dies: 128
Nov 24 06:53:38.713294 kernel: CPU topo: Max. dies per package: 1
Nov 24 06:53:38.713299 kernel: CPU topo: Max. threads per core: 1
Nov 24 06:53:38.713304 kernel: CPU topo: Num. cores per package: 1
Nov 24 06:53:38.713309 kernel: CPU topo: Num. threads per package: 1
Nov 24 06:53:38.713314 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Nov 24 06:53:38.713319 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Nov 24 06:53:38.713325 kernel: Booting paravirtualized kernel on VMware hypervisor
Nov 24 06:53:38.713330 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 24 06:53:38.713336 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Nov 24 06:53:38.713342 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Nov 24 06:53:38.713347 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Nov 24 06:53:38.713353 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Nov 24 06:53:38.713358 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Nov 24 06:53:38.713363 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Nov 24 06:53:38.713368 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Nov 24 06:53:38.713373 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Nov 24 06:53:38.713379 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Nov 24 06:53:38.713385 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Nov 24 06:53:38.713390 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Nov 24 06:53:38.713395 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Nov 24 06:53:38.713400 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Nov 24 06:53:38.713405 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Nov 24 06:53:38.713411 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Nov 24 06:53:38.713416 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Nov 24 06:53:38.713421 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Nov 24 06:53:38.713426 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Nov 24 06:53:38.713433 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Nov 24 06:53:38.713439 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14
Nov 24 06:53:38.713444 kernel: random: crng init done
Nov 24 06:53:38.713450 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Nov 24 06:53:38.713455 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Nov 24 06:53:38.713460 kernel: printk: log_buf_len min size: 262144 bytes
Nov 24 06:53:38.713465 kernel: printk: log_buf_len: 1048576 bytes
Nov 24 06:53:38.713471 kernel: printk: early log buf free: 245704(93%)
Nov 24 06:53:38.713477 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 24 06:53:38.713483 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 24 06:53:38.713488 kernel: Fallback order for Node 0: 0
Nov 24 06:53:38.713493 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Nov 24 06:53:38.713498 kernel: Policy zone: DMA32
Nov 24 06:53:38.713504 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 24 06:53:38.713509 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Nov 24 06:53:38.713514 kernel: ftrace: allocating 40103 entries in 157 pages
Nov 24 06:53:38.713520 kernel: ftrace: allocated 157 pages with 5 groups
Nov 24 06:53:38.713526 kernel: Dynamic Preempt: voluntary
Nov 24 06:53:38.713531 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 24 06:53:38.713537 kernel: rcu: RCU event tracing is enabled.
Nov 24 06:53:38.713542 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Nov 24 06:53:38.713547 kernel: Trampoline variant of Tasks RCU enabled.
Nov 24 06:53:38.713553 kernel: Rude variant of Tasks RCU enabled.
Nov 24 06:53:38.713558 kernel: Tracing variant of Tasks RCU enabled.
Nov 24 06:53:38.713563 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 24 06:53:38.713569 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Nov 24 06:53:38.713574 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Nov 24 06:53:38.713580 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Nov 24 06:53:38.713586 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Nov 24 06:53:38.713591 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Nov 24 06:53:38.713596 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Nov 24 06:53:38.713602 kernel: Console: colour VGA+ 80x25
Nov 24 06:53:38.713607 kernel: printk: legacy console [tty0] enabled
Nov 24 06:53:38.713612 kernel: printk: legacy console [ttyS0] enabled
Nov 24 06:53:38.713617 kernel: ACPI: Core revision 20240827
Nov 24 06:53:38.713623 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Nov 24 06:53:38.713629 kernel: APIC: Switch to symmetric I/O mode setup
Nov 24 06:53:38.713635 kernel: x2apic enabled
Nov 24 06:53:38.713640 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 24 06:53:38.713645 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Nov 24 06:53:38.713651 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Nov 24 06:53:38.713663 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Nov 24 06:53:38.713677 kernel: Disabled fast string operations
Nov 24 06:53:38.713682 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Nov 24 06:53:38.713688 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Nov 24 06:53:38.713696 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 24 06:53:38.713701 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Nov 24 06:53:38.713706 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Nov 24 06:53:38.713712 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Nov 24 06:53:38.713717 kernel: RETBleed: Mitigation: Enhanced IBRS
Nov 24 06:53:38.713723 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 24 06:53:38.713732 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 24 06:53:38.713737 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Nov 24 06:53:38.713761 kernel: SRBDS: Unknown: Dependent on hypervisor status
Nov 24 06:53:38.713767 kernel: GDS: Unknown: Dependent on hypervisor status
Nov 24 06:53:38.713773 kernel: active return thunk: its_return_thunk
Nov 24 06:53:38.713799 kernel: ITS: Mitigation: Aligned branch/return thunks
Nov 24 06:53:38.713805 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 24 06:53:38.713811 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 24 06:53:38.713816 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 24 06:53:38.713821 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Nov 24 06:53:38.713827 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 24 06:53:38.713834 kernel: Freeing SMP alternatives memory: 32K
Nov 24 06:53:38.713839 kernel: pid_max: default: 131072 minimum: 1024
Nov 24 06:53:38.713844 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Nov 24 06:53:38.713850 kernel: landlock: Up and running.
Nov 24 06:53:38.713855 kernel: SELinux: Initializing.
Nov 24 06:53:38.713860 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Nov 24 06:53:38.713866 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Nov 24 06:53:38.713871 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Nov 24 06:53:38.713876 kernel: Performance Events: Skylake events, core PMU driver.
Nov 24 06:53:38.713883 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Nov 24 06:53:38.713889 kernel: core: CPUID marked event: 'instructions' unavailable
Nov 24 06:53:38.713894 kernel: core: CPUID marked event: 'bus cycles' unavailable
Nov 24 06:53:38.713899 kernel: core: CPUID marked event: 'cache references' unavailable
Nov 24 06:53:38.713904 kernel: core: CPUID marked event: 'cache misses' unavailable
Nov 24 06:53:38.713910 kernel: core: CPUID marked event: 'branch instructions' unavailable
Nov 24 06:53:38.713915 kernel: core: CPUID marked event: 'branch misses' unavailable
Nov 24 06:53:38.713920 kernel: ... version: 1
Nov 24 06:53:38.713925 kernel: ... bit width: 48
Nov 24 06:53:38.713932 kernel: ... generic registers: 4
Nov 24 06:53:38.713937 kernel: ... value mask: 0000ffffffffffff
Nov 24 06:53:38.713942 kernel: ... max period: 000000007fffffff
Nov 24 06:53:38.713948 kernel: ... fixed-purpose events: 0
Nov 24 06:53:38.713953 kernel: ... event mask: 000000000000000f
Nov 24 06:53:38.713958 kernel: signal: max sigframe size: 1776
Nov 24 06:53:38.713964 kernel: rcu: Hierarchical SRCU implementation.
Nov 24 06:53:38.713969 kernel: rcu: Max phase no-delay instances is 400.
Nov 24 06:53:38.713975 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Nov 24 06:53:38.713981 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Nov 24 06:53:38.713987 kernel: smp: Bringing up secondary CPUs ...
Nov 24 06:53:38.713992 kernel: smpboot: x86: Booting SMP configuration:
Nov 24 06:53:38.713997 kernel: .... node #0, CPUs: #1
Nov 24 06:53:38.714003 kernel: Disabled fast string operations
Nov 24 06:53:38.714008 kernel: smp: Brought up 1 node, 2 CPUs
Nov 24 06:53:38.714013 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Nov 24 06:53:38.714019 kernel: Memory: 1916068K/2096628K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46200K init, 2560K bss, 169176K reserved, 0K cma-reserved)
Nov 24 06:53:38.714024 kernel: devtmpfs: initialized
Nov 24 06:53:38.714031 kernel: x86/mm: Memory block size: 128MB
Nov 24 06:53:38.714036 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Nov 24 06:53:38.714042 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 24 06:53:38.714047 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Nov 24 06:53:38.714052 kernel: pinctrl core: initialized pinctrl subsystem
Nov 24 06:53:38.714058 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 24 06:53:38.714063 kernel: audit: initializing netlink subsys (disabled)
Nov 24 06:53:38.714069 kernel: audit: type=2000 audit(1763967216.266:1): state=initialized audit_enabled=0 res=1
Nov 24 06:53:38.714074 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 24 06:53:38.714080 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 24 06:53:38.714085 kernel: cpuidle: using governor menu
Nov 24 06:53:38.714091 kernel: Simple Boot Flag at 0x36 set to 0x80
Nov 24 06:53:38.714096 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 24 06:53:38.714102 kernel: dca service started, version 1.12.1
Nov 24 06:53:38.714107 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Nov 24 06:53:38.714121 kernel: PCI: Using configuration type 1 for base access
Nov 24 06:53:38.714128 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 24 06:53:38.714134 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 24 06:53:38.714140 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 24 06:53:38.714146 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 24 06:53:38.714151 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 24 06:53:38.714157 kernel: ACPI: Added _OSI(Module Device)
Nov 24 06:53:38.714163 kernel: ACPI: Added _OSI(Processor Device)
Nov 24 06:53:38.714168 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 24 06:53:38.714174 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 24 06:53:38.714179 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Nov 24 06:53:38.714184 kernel: ACPI: Interpreter enabled
Nov 24 06:53:38.714191 kernel: ACPI: PM: (supports S0 S1 S5)
Nov 24 06:53:38.714197 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 24 06:53:38.714203 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 24 06:53:38.714208 kernel: PCI: Using E820 reservations for host bridge windows
Nov 24 06:53:38.714214 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Nov 24 06:53:38.714220 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Nov 24 06:53:38.714297 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Nov 24 06:53:38.714348 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Nov 24 06:53:38.714398 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Nov 24 06:53:38.714406 kernel: PCI host bridge to bus 0000:00
Nov 24 06:53:38.714457 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 24 06:53:38.714501 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Nov 24 06:53:38.714543 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 24 06:53:38.714584 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Nov 24 06:53:38.714625 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Nov 24 06:53:38.715144 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Nov 24 06:53:38.715248 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Nov 24 06:53:38.715342 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Nov 24 06:53:38.715394 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Nov 24 06:53:38.715451 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Nov 24 06:53:38.715504 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Nov 24 06:53:38.715553 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Nov 24 06:53:38.715601 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Nov 24 06:53:38.715649 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Nov 24 06:53:38.715712 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Nov 24 06:53:38.715763 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Nov 24 06:53:38.715817 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 24 06:53:38.715867 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Nov 24 06:53:38.715915 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Nov 24 06:53:38.715966 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Nov 24 06:53:38.716014 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Nov 24 06:53:38.716064 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Nov 24 06:53:38.716115 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Nov 24 06:53:38.716163 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Nov 24 06:53:38.716209 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Nov 24 06:53:38.716256 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Nov 24 06:53:38.716302 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Nov 24 06:53:38.716348 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 24 06:53:38.716401 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Nov 24 06:53:38.716451 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Nov 24 06:53:38.716498 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Nov 24 06:53:38.716545 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Nov 24 06:53:38.716591 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Nov 24 06:53:38.716642 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Nov 24 06:53:38.716711 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Nov 24 06:53:38.716764 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Nov 24 06:53:38.716812 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Nov 24 06:53:38.716859 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Nov 24 06:53:38.716914 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Nov 24 06:53:38.716963 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Nov 24 06:53:38.717010 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Nov 24 06:53:38.717058 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Nov 24 06:53:38.717108 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Nov 24 06:53:38.717156 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Nov 24 06:53:38.717210 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Nov 24 06:53:38.717257 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Nov 24 06:53:38.717304 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Nov 24 06:53:38.717351 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Nov 24 06:53:38.717400 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:53:38.717447 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.717499 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.717548 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:53:38.717595 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Nov 24 06:53:38.717642 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:53:38.717704 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.717863 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.717920 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:53:38.717971 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Nov 24 06:53:38.718020 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:53:38.718069 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.718123 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.718172 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:53:38.718224 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Nov 24 06:53:38.718271 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:53:38.718318 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.718369 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.718417 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:53:38.718467 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Nov 24 06:53:38.718513 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 
06:53:38.718560 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.718614 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.718681 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:53:38.718732 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Nov 24 06:53:38.718780 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:53:38.718828 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.718880 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.718928 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:53:38.718978 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Nov 24 06:53:38.719026 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Nov 24 06:53:38.719073 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.719126 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.719175 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:53:38.719222 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Nov 24 06:53:38.719269 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Nov 24 06:53:38.719319 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Nov 24 06:53:38.719365 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.719416 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.719466 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:53:38.719514 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Nov 24 06:53:38.719561 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Nov 24 06:53:38.719626 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:53:38.719709 kernel: pci 0000:00:16.2: PME# supported 
from D0 D3hot D3cold Nov 24 06:53:38.719794 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.719869 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:53:38.719919 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Nov 24 06:53:38.719967 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:53:38.720017 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.720085 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.720136 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:53:38.720183 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Nov 24 06:53:38.720231 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:53:38.720278 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.720329 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.720378 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:53:38.720425 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Nov 24 06:53:38.720474 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:53:38.720522 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.720576 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.720624 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:53:38.720692 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Nov 24 06:53:38.720751 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:53:38.720917 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.720974 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.721027 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 
06:53:38.721075 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Nov 24 06:53:38.721123 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:53:38.721169 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.721221 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.721269 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 24 06:53:38.721316 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Nov 24 06:53:38.721366 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Nov 24 06:53:38.721414 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:53:38.721461 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.721530 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.721579 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:53:38.721643 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Nov 24 06:53:38.721701 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Nov 24 06:53:38.721748 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:53:38.721796 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.721849 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.721898 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:53:38.721950 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Nov 24 06:53:38.721997 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Nov 24 06:53:38.722043 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 06:53:38.722090 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.722141 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.722188 kernel: pci 
0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:53:38.722235 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Nov 24 06:53:38.722283 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 06:53:38.722330 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.722381 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.722428 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:53:38.722474 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Nov 24 06:53:38.722521 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:53:38.722568 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.722638 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.722724 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:53:38.722772 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Nov 24 06:53:38.722820 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:53:38.722867 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.722920 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.722968 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:53:38.723017 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Nov 24 06:53:38.723065 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Nov 24 06:53:38.723112 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.723163 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.723211 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:53:38.723258 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Nov 24 06:53:38.723305 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:53:38.723352 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.723405 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.723453 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:53:38.723500 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Nov 24 06:53:38.723548 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Nov 24 06:53:38.723595 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:53:38.723642 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.723704 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.723782 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:53:38.723832 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Nov 24 06:53:38.723880 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Nov 24 06:53:38.723928 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:53:38.723977 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.724034 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.724083 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:53:38.724133 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Nov 24 06:53:38.724181 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:53:38.724229 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.724282 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.724332 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Nov 24 06:53:38.724380 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Nov 24 06:53:38.724428 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 
24 06:53:38.724478 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.724530 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.724579 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:53:38.724628 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Nov 24 06:53:38.724693 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Nov 24 06:53:38.724742 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.724795 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.724847 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:53:38.724895 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Nov 24 06:53:38.724942 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:53:38.724990 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.725042 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.725090 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:53:38.725138 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Nov 24 06:53:38.725188 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:53:38.725236 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.725289 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Nov 24 06:53:38.725338 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:53:38.725386 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Nov 24 06:53:38.725434 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:53:38.725481 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.725535 kernel: pci_bus 0000:01: extended config space not accessible Nov 24 06:53:38.725587 kernel: pci 
0000:00:01.0: PCI bridge to [bus 01] Nov 24 06:53:38.725636 kernel: pci_bus 0000:02: extended config space not accessible Nov 24 06:53:38.725645 kernel: acpiphp: Slot [32] registered Nov 24 06:53:38.725651 kernel: acpiphp: Slot [33] registered Nov 24 06:53:38.725680 kernel: acpiphp: Slot [34] registered Nov 24 06:53:38.725686 kernel: acpiphp: Slot [35] registered Nov 24 06:53:38.725692 kernel: acpiphp: Slot [36] registered Nov 24 06:53:38.725698 kernel: acpiphp: Slot [37] registered Nov 24 06:53:38.725705 kernel: acpiphp: Slot [38] registered Nov 24 06:53:38.725711 kernel: acpiphp: Slot [39] registered Nov 24 06:53:38.725717 kernel: acpiphp: Slot [40] registered Nov 24 06:53:38.725722 kernel: acpiphp: Slot [41] registered Nov 24 06:53:38.725728 kernel: acpiphp: Slot [42] registered Nov 24 06:53:38.725734 kernel: acpiphp: Slot [43] registered Nov 24 06:53:38.725740 kernel: acpiphp: Slot [44] registered Nov 24 06:53:38.725745 kernel: acpiphp: Slot [45] registered Nov 24 06:53:38.725751 kernel: acpiphp: Slot [46] registered Nov 24 06:53:38.725756 kernel: acpiphp: Slot [47] registered Nov 24 06:53:38.725763 kernel: acpiphp: Slot [48] registered Nov 24 06:53:38.725769 kernel: acpiphp: Slot [49] registered Nov 24 06:53:38.725775 kernel: acpiphp: Slot [50] registered Nov 24 06:53:38.725781 kernel: acpiphp: Slot [51] registered Nov 24 06:53:38.725786 kernel: acpiphp: Slot [52] registered Nov 24 06:53:38.725792 kernel: acpiphp: Slot [53] registered Nov 24 06:53:38.725798 kernel: acpiphp: Slot [54] registered Nov 24 06:53:38.725803 kernel: acpiphp: Slot [55] registered Nov 24 06:53:38.725809 kernel: acpiphp: Slot [56] registered Nov 24 06:53:38.725816 kernel: acpiphp: Slot [57] registered Nov 24 06:53:38.725821 kernel: acpiphp: Slot [58] registered Nov 24 06:53:38.725827 kernel: acpiphp: Slot [59] registered Nov 24 06:53:38.725833 kernel: acpiphp: Slot [60] registered Nov 24 06:53:38.725839 kernel: acpiphp: Slot [61] registered Nov 24 06:53:38.725844 kernel: acpiphp: Slot 
[62] registered Nov 24 06:53:38.725850 kernel: acpiphp: Slot [63] registered Nov 24 06:53:38.725901 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Nov 24 06:53:38.725950 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Nov 24 06:53:38.726000 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Nov 24 06:53:38.726048 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Nov 24 06:53:38.726096 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Nov 24 06:53:38.726144 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Nov 24 06:53:38.726198 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Nov 24 06:53:38.726268 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Nov 24 06:53:38.726333 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Nov 24 06:53:38.726422 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Nov 24 06:53:38.726472 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Nov 24 06:53:38.726520 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Nov 24 06:53:38.726569 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Nov 24 06:53:38.726618 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Nov 24 06:53:38.726678 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Nov 24 06:53:38.726729 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:53:38.726781 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:53:38.726832 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:53:38.726881 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:53:38.726929 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:53:38.726982 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Nov 24 06:53:38.727032 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Nov 24 06:53:38.727081 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Nov 24 06:53:38.727132 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Nov 24 06:53:38.727181 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Nov 24 06:53:38.727230 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Nov 24 06:53:38.727279 kernel: pci 0000:0b:00.0: supports D1 D2 Nov 24 06:53:38.727328 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Nov 24 06:53:38.727377 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Nov 24 06:53:38.727426 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:53:38.727474 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:53:38.727524 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:53:38.727573 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:53:38.727621 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:53:38.727683 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:53:38.727732 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:53:38.727801 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 06:53:38.727848 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 24 06:53:38.727895 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:53:38.727945 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:53:38.727992 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:53:38.728039 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:53:38.728086 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:53:38.728134 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:53:38.728180 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:53:38.728227 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:53:38.728276 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:53:38.728323 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:53:38.728370 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Nov 24 06:53:38.728418 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:53:38.728465 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:53:38.728514 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:53:38.728561 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:53:38.728571 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Nov 24 06:53:38.728577 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Nov 24 06:53:38.728583 kernel: ACPI: PCI: Interrupt link LNKB 
disabled Nov 24 06:53:38.728589 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Nov 24 06:53:38.728595 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Nov 24 06:53:38.728600 kernel: iommu: Default domain type: Translated Nov 24 06:53:38.728606 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Nov 24 06:53:38.728612 kernel: PCI: Using ACPI for IRQ routing Nov 24 06:53:38.728617 kernel: PCI: pci_cache_line_size set to 64 bytes Nov 24 06:53:38.728624 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Nov 24 06:53:38.728630 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Nov 24 06:53:38.728689 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Nov 24 06:53:38.728738 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Nov 24 06:53:38.728786 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Nov 24 06:53:38.728794 kernel: vgaarb: loaded Nov 24 06:53:38.728800 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Nov 24 06:53:38.728806 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Nov 24 06:53:38.728811 kernel: clocksource: Switched to clocksource tsc-early Nov 24 06:53:38.728819 kernel: VFS: Disk quotas dquot_6.6.0 Nov 24 06:53:38.728825 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Nov 24 06:53:38.728831 kernel: pnp: PnP ACPI init Nov 24 06:53:38.728883 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Nov 24 06:53:38.728928 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Nov 24 06:53:38.728970 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Nov 24 06:53:38.729016 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Nov 24 06:53:38.729065 kernel: pnp 00:06: [dma 2] Nov 24 06:53:38.729113 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Nov 24 06:53:38.729157 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Nov 24 
06:53:38.729200 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Nov 24 06:53:38.729208 kernel: pnp: PnP ACPI: found 8 devices Nov 24 06:53:38.729232 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Nov 24 06:53:38.729238 kernel: NET: Registered PF_INET protocol family Nov 24 06:53:38.729246 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Nov 24 06:53:38.729252 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Nov 24 06:53:38.729257 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Nov 24 06:53:38.729263 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Nov 24 06:53:38.729269 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Nov 24 06:53:38.729275 kernel: TCP: Hash tables configured (established 16384 bind 16384) Nov 24 06:53:38.729280 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Nov 24 06:53:38.729286 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Nov 24 06:53:38.729292 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Nov 24 06:53:38.729299 kernel: NET: Registered PF_XDP protocol family Nov 24 06:53:38.729903 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Nov 24 06:53:38.731229 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Nov 24 06:53:38.731287 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Nov 24 06:53:38.731338 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Nov 24 06:53:38.731388 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Nov 24 06:53:38.731437 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Nov 24 06:53:38.731486 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Nov 24 06:53:38.731538 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Nov 24 06:53:38.731587 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Nov 24 06:53:38.731635 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Nov 24 06:53:38.731700 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Nov 24 06:53:38.731748 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Nov 24 06:53:38.731797 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Nov 24 06:53:38.731844 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Nov 24 06:53:38.731891 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Nov 24 06:53:38.731942 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Nov 24 06:53:38.731989 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Nov 24 06:53:38.732037 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Nov 24 06:53:38.732085 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Nov 24 06:53:38.732132 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Nov 24 06:53:38.732180 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Nov 24 06:53:38.732228 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Nov 24 06:53:38.732277 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Nov 24 06:53:38.732325 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Nov 24 06:53:38.732372 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Nov 24 06:53:38.732427 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.732500 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.732560 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.732627 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.733815 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.733871 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.733930 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.733988 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734053 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.734102 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734150 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.734197 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734245 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.734296 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734345 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.734393 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734442 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.734489 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734537 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
Nov 24 06:53:38.734586 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.734634 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.735735 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.735793 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.735846 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.735896 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.735945 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.735993 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736042 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736091 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736142 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736191 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736240 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736288 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736336 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736384 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736432 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736483 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.736531 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.736578 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space Nov 24 06:53:38.736626 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.738697 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.738753 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.738803 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.738852 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.738900 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.738951 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.738998 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739045 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739092 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739139 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739186 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739233 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739280 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739327 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739375 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739422 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739469 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739515 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739562 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739609 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739661 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739710 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739758 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739865 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.739918 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.739979 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740028 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740075 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740122 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740168 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740215 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740262 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740308 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740358 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740407 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740453 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740502 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740550 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.740596 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.740645 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.741968 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.742024 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.742075 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Nov 24 06:53:38.742264 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Nov 24 06:53:38.742318 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Nov 24 06:53:38.742378 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Nov 24 06:53:38.742427 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Nov 24 06:53:38.742475 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Nov 24 06:53:38.742526 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Nov 24 06:53:38.742577 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Nov 24 06:53:38.745019 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Nov 24 06:53:38.745095 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Nov 24 06:53:38.745147 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Nov 24 06:53:38.745198 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Nov 24 06:53:38.745248 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Nov 24 06:53:38.745297 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Nov 24 06:53:38.745345 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Nov 24 06:53:38.745398 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Nov 24 06:53:38.745448 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Nov 24 06:53:38.745496 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Nov 24 06:53:38.745544 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] Nov 24 06:53:38.745592 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:53:38.745640 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Nov 24 06:53:38.745704 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Nov 24 06:53:38.745755 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:53:38.745804 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Nov 24 06:53:38.745856 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Nov 24 06:53:38.745904 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:53:38.745952 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Nov 24 06:53:38.746000 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Nov 24 06:53:38.746049 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:53:38.746097 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Nov 24 06:53:38.746145 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Nov 24 06:53:38.746196 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 06:53:38.746245 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Nov 24 06:53:38.746294 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Nov 24 06:53:38.746342 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:53:38.746394 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Nov 24 06:53:38.746444 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Nov 24 06:53:38.746492 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Nov 24 06:53:38.746541 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Nov 24 06:53:38.746592 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Nov 24 06:53:38.746641 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Nov 24 06:53:38.747135 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Nov 24 06:53:38.747191 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Nov 24 06:53:38.747243 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Nov 24 06:53:38.747295 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Nov 24 06:53:38.747345 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Nov 24 06:53:38.747394 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Nov 24 06:53:38.747443 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:53:38.747496 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Nov 24 06:53:38.747546 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Nov 24 06:53:38.747594 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:53:38.747643 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Nov 24 06:53:38.747724 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Nov 24 06:53:38.747817 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:53:38.747868 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Nov 24 06:53:38.747916 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Nov 24 06:53:38.747968 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:53:38.748016 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Nov 24 06:53:38.748065 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Nov 24 06:53:38.748113 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:53:38.748162 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Nov 24 06:53:38.748210 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Nov 24 06:53:38.748258 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:53:38.748310 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Nov 
24 06:53:38.748359 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Nov 24 06:53:38.748407 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Nov 24 06:53:38.748456 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:53:38.748507 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Nov 24 06:53:38.748555 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Nov 24 06:53:38.748602 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Nov 24 06:53:38.748650 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:53:38.748725 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Nov 24 06:53:38.748777 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Nov 24 06:53:38.748826 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Nov 24 06:53:38.748875 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 06:53:38.748938 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Nov 24 06:53:38.748986 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Nov 24 06:53:38.749034 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 06:53:38.749082 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Nov 24 06:53:38.749129 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Nov 24 06:53:38.749176 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:53:38.749422 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Nov 24 06:53:38.749473 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Nov 24 06:53:38.749523 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:53:38.749573 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Nov 24 06:53:38.749622 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Nov 24 06:53:38.749697 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] Nov 24 06:53:38.749771 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Nov 24 06:53:38.749834 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Nov 24 06:53:38.749953 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:53:38.750010 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Nov 24 06:53:38.750059 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Nov 24 06:53:38.750107 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Nov 24 06:53:38.750170 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:53:38.750464 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Nov 24 06:53:38.750515 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Nov 24 06:53:38.750566 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Nov 24 06:53:38.750741 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:53:38.750800 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Nov 24 06:53:38.750850 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Nov 24 06:53:38.750898 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:53:38.750946 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Nov 24 06:53:38.750995 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Nov 24 06:53:38.751043 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 24 06:53:38.751094 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Nov 24 06:53:38.751141 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Nov 24 06:53:38.751190 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Nov 24 06:53:38.751237 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Nov 24 06:53:38.751285 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Nov 24 06:53:38.751332 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:53:38.751381 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Nov 24 06:53:38.751428 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Nov 24 06:53:38.751478 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:53:38.751527 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Nov 24 06:53:38.751574 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Nov 24 06:53:38.751622 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:53:38.751685 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Nov 24 06:53:38.751731 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Nov 24 06:53:38.751805 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Nov 24 06:53:38.751850 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Nov 24 06:53:38.751893 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Nov 24 06:53:38.752248 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Nov 24 06:53:38.752306 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Nov 24 06:53:38.752354 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Nov 24 06:53:38.752398 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Nov 24 06:53:38.752443 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Nov 24 06:53:38.752489 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Nov 24 06:53:38.752532 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Nov 24 06:53:38.752575 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Nov 24 06:53:38.752623 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Nov 24 06:53:38.753862 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Nov 24 06:53:38.753911 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] Nov 24 06:53:38.753960 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Nov 24 06:53:38.754018 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Nov 24 06:53:38.754062 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Nov 24 06:53:38.754110 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Nov 24 06:53:38.754154 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Nov 24 06:53:38.754197 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Nov 24 06:53:38.754245 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Nov 24 06:53:38.754288 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Nov 24 06:53:38.754337 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Nov 24 06:53:38.754381 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Nov 24 06:53:38.754429 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Nov 24 06:53:38.754490 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Nov 24 06:53:38.754541 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Nov 24 06:53:38.754586 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Nov 24 06:53:38.754637 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Nov 24 06:53:38.754702 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Nov 24 06:53:38.754751 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Nov 24 06:53:38.754796 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Nov 24 06:53:38.754840 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Nov 24 06:53:38.754891 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Nov 24 06:53:38.754936 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Nov 24 06:53:38.754980 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] Nov 24 06:53:38.755030 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Nov 24 06:53:38.755075 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Nov 24 06:53:38.755119 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Nov 24 06:53:38.755167 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Nov 24 06:53:38.755214 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Nov 24 06:53:38.755262 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Nov 24 06:53:38.755307 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Nov 24 06:53:38.755354 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Nov 24 06:53:38.755399 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Nov 24 06:53:38.755448 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Nov 24 06:53:38.755495 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Nov 24 06:53:38.755546 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Nov 24 06:53:38.755611 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Nov 24 06:53:38.755978 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Nov 24 06:53:38.756029 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Nov 24 06:53:38.756074 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Nov 24 06:53:38.756124 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Nov 24 06:53:38.756171 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Nov 24 06:53:38.756220 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Nov 24 06:53:38.756268 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Nov 24 06:53:38.756312 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Nov 24 06:53:38.756356 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Nov 24 
06:53:38.756406 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Nov 24 06:53:38.756453 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Nov 24 06:53:38.756501 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Nov 24 06:53:38.756546 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Nov 24 06:53:38.756594 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Nov 24 06:53:38.756638 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Nov 24 06:53:38.757777 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Nov 24 06:53:38.757849 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Nov 24 06:53:38.757904 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Nov 24 06:53:38.757961 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Nov 24 06:53:38.758011 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Nov 24 06:53:38.758057 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Nov 24 06:53:38.758103 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Nov 24 06:53:38.758152 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Nov 24 06:53:38.758200 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Nov 24 06:53:38.758244 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Nov 24 06:53:38.758293 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Nov 24 06:53:38.758338 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Nov 24 06:53:38.758387 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Nov 24 06:53:38.758432 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Nov 24 06:53:38.758570 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Nov 24 06:53:38.758619 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
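Each `pci_bus ... resource` line above encodes a memory window as an inclusive `[start-end]` range. A small parser (hypothetical helper, not kernel code) recovers the window size; for instance, every 64-bit prefetchable window in this listing is 1 MiB:

```python
import re

# Match lines such as:
#   pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
RES_RE = re.compile(
    r"pci_bus (?P<bus>[0-9a-f]{4}:[0-9a-f]{2}): resource (?P<idx>\d+) "
    r"\[mem (?P<start>0x[0-9a-f]+)-(?P<end>0x[0-9a-f]+)"
)

def window_size(line):
    """Size in bytes of the memory window described by a pci_bus resource line."""
    m = RES_RE.search(line)
    if m is None:
        raise ValueError("not a pci_bus mem resource line")
    # The logged range is inclusive, hence the +1.
    return int(m.group("end"), 16) - int(m.group("start"), 16) + 1

line = "pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]"
```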
Nov 24 06:53:38.758705 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Nov 24 06:53:38.758753 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Nov 24 06:53:38.758803 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Nov 24 06:53:38.758849 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Nov 24 06:53:38.758898 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Nov 24 06:53:38.758946 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Nov 24 06:53:38.759002 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Nov 24 06:53:38.759012 kernel: PCI: CLS 32 bytes, default 64 Nov 24 06:53:38.759018 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Nov 24 06:53:38.759025 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Nov 24 06:53:38.759031 kernel: clocksource: Switched to clocksource tsc Nov 24 06:53:38.759037 kernel: Initialise system trusted keyrings Nov 24 06:53:38.759043 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Nov 24 06:53:38.759051 kernel: Key type asymmetric registered Nov 24 06:53:38.759058 kernel: Asymmetric key parser 'x509' registered Nov 24 06:53:38.759063 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Nov 24 06:53:38.759070 kernel: io scheduler mq-deadline registered Nov 24 06:53:38.759075 kernel: io scheduler kyber registered Nov 24 06:53:38.759081 kernel: io scheduler bfq registered Nov 24 06:53:38.759133 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Nov 24 06:53:38.759186 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759241 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Nov 24 06:53:38.759291 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759342 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Nov 24 06:53:38.759394 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759445 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Nov 24 06:53:38.759498 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759550 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Nov 24 06:53:38.759602 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759653 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Nov 24 06:53:38.759713 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759790 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Nov 24 06:53:38.759852 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.759901 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Nov 24 06:53:38.759950 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760001 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Nov 24 06:53:38.760048 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760097 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Nov 24 06:53:38.760145 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760192 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Nov 24 06:53:38.760240 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760288 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Nov 24 06:53:38.760338 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760395 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Nov 24 06:53:38.760446 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760505 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Nov 24 06:53:38.760554 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760603 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Nov 24 06:53:38.760652 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760723 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Nov 24 06:53:38.760781 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760830 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Nov 24 06:53:38.760879 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.760928 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Nov 24 
06:53:38.760976 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761025 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Nov 24 06:53:38.761074 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761125 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Nov 24 06:53:38.761174 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761240 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Nov 24 06:53:38.761324 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761373 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Nov 24 06:53:38.761423 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761473 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Nov 24 06:53:38.761522 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761574 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Nov 24 06:53:38.761623 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.761931 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Nov 24 06:53:38.761984 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762034 kernel: pcieport 0000:00:18.1: PME: 
Signaling with IRQ 49 Nov 24 06:53:38.762082 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762131 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Nov 24 06:53:38.762182 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762231 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Nov 24 06:53:38.762298 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762347 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Nov 24 06:53:38.762396 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762446 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Nov 24 06:53:38.762495 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762545 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Nov 24 06:53:38.762597 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762647 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Nov 24 06:53:38.762706 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Nov 24 06:53:38.762717 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Nov 24 06:53:38.762724 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Nov 24 06:53:38.762730 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Nov 24 
06:53:38.762736 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Nov 24 06:53:38.762744 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Nov 24 06:53:38.762750 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Nov 24 06:53:38.762806 kernel: rtc_cmos 00:01: registered as rtc0 Nov 24 06:53:38.762873 kernel: rtc_cmos 00:01: setting system clock to 2025-11-24T06:53:38 UTC (1763967218) Nov 24 06:53:38.762882 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Nov 24 06:53:38.762941 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Nov 24 06:53:38.762950 kernel: intel_pstate: CPU model not supported Nov 24 06:53:38.762956 kernel: NET: Registered PF_INET6 protocol family Nov 24 06:53:38.762964 kernel: Segment Routing with IPv6 Nov 24 06:53:38.762970 kernel: In-situ OAM (IOAM) with IPv6 Nov 24 06:53:38.762976 kernel: NET: Registered PF_PACKET protocol family Nov 24 06:53:38.762983 kernel: Key type dns_resolver registered Nov 24 06:53:38.762989 kernel: IPI shorthand broadcast: enabled Nov 24 06:53:38.762995 kernel: sched_clock: Marking stable (2539003215, 163522359)->(2718524989, -15999415) Nov 24 06:53:38.763001 kernel: registered taskstats version 1 Nov 24 06:53:38.763007 kernel: Loading compiled-in X.509 certificates Nov 24 06:53:38.763013 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.58-flatcar: 960cbe7f2b1ea74b5c881d6d42eea4d1ac19a607' Nov 24 06:53:38.763020 kernel: Demotion targets for Node 0: null Nov 24 06:53:38.763027 kernel: Key type .fscrypt registered Nov 24 06:53:38.763033 kernel: Key type fscrypt-provisioning registered Nov 24 06:53:38.763039 kernel: ima: No TPM chip found, activating TPM-bypass! 
Nov 24 06:53:38.763045 kernel: ima: Allocated hash algorithm: sha1 Nov 24 06:53:38.763051 kernel: ima: No architecture policies found Nov 24 06:53:38.763057 kernel: clk: Disabling unused clocks Nov 24 06:53:38.763063 kernel: Warning: unable to open an initial console. Nov 24 06:53:38.763069 kernel: Freeing unused kernel image (initmem) memory: 46200K Nov 24 06:53:38.763077 kernel: Write protecting the kernel read-only data: 40960k Nov 24 06:53:38.763083 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Nov 24 06:53:38.763089 kernel: Run /init as init process Nov 24 06:53:38.763095 kernel: with arguments: Nov 24 06:53:38.763101 kernel: /init Nov 24 06:53:38.763107 kernel: with environment: Nov 24 06:53:38.763113 kernel: HOME=/ Nov 24 06:53:38.763119 kernel: TERM=linux Nov 24 06:53:38.763126 systemd[1]: Successfully made /usr/ read-only. Nov 24 06:53:38.763136 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Nov 24 06:53:38.763143 systemd[1]: Detected virtualization vmware. Nov 24 06:53:38.763149 systemd[1]: Detected architecture x86-64. Nov 24 06:53:38.763155 systemd[1]: Running in initrd. Nov 24 06:53:38.763162 systemd[1]: No hostname configured, using default hostname. Nov 24 06:53:38.763168 systemd[1]: Hostname set to . Nov 24 06:53:38.763174 systemd[1]: Initializing machine ID from random generator. Nov 24 06:53:38.763182 systemd[1]: Queued start job for default target initrd.target. Nov 24 06:53:38.763188 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Nov 24 06:53:38.763195 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Nov 24 06:53:38.763202 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Nov 24 06:53:38.763208 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Nov 24 06:53:38.763214 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Nov 24 06:53:38.763221 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Nov 24 06:53:38.763229 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Nov 24 06:53:38.763236 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Nov 24 06:53:38.763242 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Nov 24 06:53:38.763249 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Nov 24 06:53:38.763255 systemd[1]: Reached target paths.target - Path Units. Nov 24 06:53:38.763261 systemd[1]: Reached target slices.target - Slice Units. Nov 24 06:53:38.763267 systemd[1]: Reached target swap.target - Swaps. Nov 24 06:53:38.763274 systemd[1]: Reached target timers.target - Timer Units. Nov 24 06:53:38.763281 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Nov 24 06:53:38.763288 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Nov 24 06:53:38.763294 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Nov 24 06:53:38.763300 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Nov 24 06:53:38.763307 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Nov 24 06:53:38.763313 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Nov 24 06:53:38.763321 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Nov 24 06:53:38.763328 systemd[1]: Reached target sockets.target - Socket Units. Nov 24 06:53:38.763334 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Nov 24 06:53:38.763342 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Nov 24 06:53:38.763348 systemd[1]: Finished network-cleanup.service - Network Cleanup. Nov 24 06:53:38.763354 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Nov 24 06:53:38.763361 systemd[1]: Starting systemd-fsck-usr.service... Nov 24 06:53:38.763367 systemd[1]: Starting systemd-journald.service - Journal Service... Nov 24 06:53:38.763374 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Nov 24 06:53:38.763380 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:53:38.763386 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Nov 24 06:53:38.763394 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Nov 24 06:53:38.763401 systemd[1]: Finished systemd-fsck-usr.service. Nov 24 06:53:38.763407 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Nov 24 06:53:38.763427 systemd-journald[225]: Collecting audit messages is disabled. Nov 24 06:53:38.763444 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Nov 24 06:53:38.763451 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Nov 24 06:53:38.763457 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:53:38.763464 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Nov 24 06:53:38.763472 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Nov 24 06:53:38.763478 kernel: Bridge firewalling registered Nov 24 06:53:38.763484 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Nov 24 06:53:38.763491 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Nov 24 06:53:38.763497 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Nov 24 06:53:38.763504 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Nov 24 06:53:38.763511 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Nov 24 06:53:38.763517 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Nov 24 06:53:38.763524 systemd-journald[225]: Journal started Nov 24 06:53:38.763539 systemd-journald[225]: Runtime Journal (/run/log/journal/3a780ed87b4f446cb0a922330a367a36) is 4.8M, max 38.5M, 33.7M free. Nov 24 06:53:38.707918 systemd-modules-load[226]: Inserted module 'overlay' Nov 24 06:53:38.742365 systemd-modules-load[226]: Inserted module 'br_netfilter' Nov 24 06:53:38.771295 systemd[1]: Started systemd-journald.service - Journal Service. Nov 24 06:53:38.772263 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Nov 24 06:53:38.775925 dracut-cmdline[254]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a5a093dfb613b73c778207057706f88d5254927e05ae90617f314b938bd34a14 Nov 24 06:53:38.781866 systemd-tmpfiles[267]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Nov 24 06:53:38.783837 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Nov 24 06:53:38.784764 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Nov 24 06:53:38.813576 systemd-resolved[292]: Positive Trust Anchors: Nov 24 06:53:38.813697 systemd-resolved[292]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Nov 24 06:53:38.813721 systemd-resolved[292]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Nov 24 06:53:38.815202 systemd-resolved[292]: Defaulting to hostname 'linux'. Nov 24 06:53:38.815732 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Nov 24 06:53:38.815858 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Nov 24 06:53:38.834671 kernel: SCSI subsystem initialized Nov 24 06:53:38.850667 kernel: Loading iSCSI transport class v2.0-870. Nov 24 06:53:38.858670 kernel: iscsi: registered transport (tcp) Nov 24 06:53:38.878917 kernel: iscsi: registered transport (qla4xxx) Nov 24 06:53:38.878933 kernel: QLogic iSCSI HBA Driver Nov 24 06:53:38.889182 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Nov 24 06:53:38.903170 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Nov 24 06:53:38.904349 systemd[1]: Reached target network-pre.target - Preparation for Network. Nov 24 06:53:38.926794 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Nov 24 06:53:38.927692 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Nov 24 06:53:38.972669 kernel: raid6: avx2x4 gen() 47898 MB/s Nov 24 06:53:38.989669 kernel: raid6: avx2x2 gen() 55300 MB/s Nov 24 06:53:39.006859 kernel: raid6: avx2x1 gen() 45900 MB/s Nov 24 06:53:39.006877 kernel: raid6: using algorithm avx2x2 gen() 55300 MB/s Nov 24 06:53:39.024890 kernel: raid6: .... xor() 32152 MB/s, rmw enabled Nov 24 06:53:39.024904 kernel: raid6: using avx2x2 recovery algorithm Nov 24 06:53:39.038667 kernel: xor: automatically using best checksumming function avx Nov 24 06:53:39.140673 kernel: Btrfs loaded, zoned=no, fsverity=no Nov 24 06:53:39.144044 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Nov 24 06:53:39.145187 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Nov 24 06:53:39.164545 systemd-udevd[475]: Using default interface naming scheme 'v255'. Nov 24 06:53:39.168211 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Nov 24 06:53:39.168771 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Nov 24 06:53:39.186772 dracut-pre-trigger[477]: rd.md=0: removing MD RAID activation Nov 24 06:53:39.200455 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Nov 24 06:53:39.201326 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Nov 24 06:53:39.270268 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Nov 24 06:53:39.272249 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Nov 24 06:53:39.336621 kernel: VMware PVSCSI driver - version 1.0.7.0-k Nov 24 06:53:39.343117 kernel: vmw_pvscsi: using 64bit dma Nov 24 06:53:39.343142 kernel: vmw_pvscsi: max_id: 16 Nov 24 06:53:39.343150 kernel: vmw_pvscsi: setting ring_pages to 8 Nov 24 06:53:39.350741 kernel: vmw_pvscsi: enabling reqCallThreshold Nov 24 06:53:39.350758 kernel: vmw_pvscsi: driver-based request coalescing enabled Nov 24 06:53:39.350767 kernel: vmw_pvscsi: using MSI-X Nov 24 06:53:39.355727 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Nov 24 06:53:39.358377 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Nov 24 06:53:39.358523 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Nov 24 06:53:39.365535 (udev-worker)[519]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Nov 24 06:53:39.371835 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Nov 24 06:53:39.371906 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:53:39.372229 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:53:39.373144 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Nov 24 06:53:39.376665 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Nov 24 06:53:39.376682 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Nov 24 06:53:39.378231 kernel: sd 0:0:0:0: [sda] Write Protect is off Nov 24 06:53:39.378311 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Nov 24 06:53:39.378372 kernel: sd 0:0:0:0: [sda] Cache data unavailable Nov 24 06:53:39.378815 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Nov 24 06:53:39.383666 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Nov 24 06:53:39.383813 kernel: libata version 3.00 loaded. Nov 24 06:53:39.383828 kernel: cryptd: max_cpu_qlen set to 1000 Nov 24 06:53:39.386671 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Nov 24 06:53:39.390670 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:53:39.394664 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Nov 24 06:53:39.398539 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:53:39.402671 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Nov 24 06:53:39.406691 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Nov 24 06:53:39.408675 kernel: ata_piix 0000:00:07.1: version 2.13 Nov 24 06:53:39.408773 kernel: AES CTR mode by8 optimization enabled Nov 24 06:53:39.418870 kernel: scsi host1: ata_piix Nov 24 06:53:39.418956 kernel: scsi host2: ata_piix Nov 24 06:53:39.419922 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Nov 24 06:53:39.421808 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Nov 24 06:53:39.447192 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Nov 24 06:53:39.452589 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Nov 24 06:53:39.457710 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. 
Nov 24 06:53:39.461838 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Nov 24 06:53:39.461973 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Nov 24 06:53:39.462613 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Nov 24 06:53:39.501671 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:53:39.593702 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Nov 24 06:53:39.597676 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Nov 24 06:53:39.628221 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Nov 24 06:53:39.628368 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Nov 24 06:53:39.645672 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Nov 24 06:53:39.948421 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Nov 24 06:53:39.948984 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Nov 24 06:53:39.949290 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Nov 24 06:53:39.949562 systemd[1]: Reached target remote-fs.target - Remote File Systems. Nov 24 06:53:39.950416 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Nov 24 06:53:39.962835 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Nov 24 06:53:40.516934 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Nov 24 06:53:40.517197 disk-uuid[624]: The operation has completed successfully. Nov 24 06:53:40.555189 systemd[1]: disk-uuid.service: Deactivated successfully. Nov 24 06:53:40.555252 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Nov 24 06:53:40.565326 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Nov 24 06:53:40.574411 sh[656]: Success Nov 24 06:53:40.587817 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Nov 24 06:53:40.587843 kernel: device-mapper: uevent: version 1.0.3 Nov 24 06:53:40.588966 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Nov 24 06:53:40.595669 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Nov 24 06:53:40.635301 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Nov 24 06:53:40.636846 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Nov 24 06:53:40.647450 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Nov 24 06:53:40.659644 kernel: BTRFS: device fsid 3af95a3e-5df6-49e0-91e3-ddf2109f68c7 devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (668) Nov 24 06:53:40.659670 kernel: BTRFS info (device dm-0): first mount of filesystem 3af95a3e-5df6-49e0-91e3-ddf2109f68c7 Nov 24 06:53:40.659680 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:53:40.669263 kernel: BTRFS info (device dm-0): enabling ssd optimizations Nov 24 06:53:40.669282 kernel: BTRFS info (device dm-0): disabling log replay at mount time Nov 24 06:53:40.669290 kernel: BTRFS info (device dm-0): enabling free space tree Nov 24 06:53:40.671223 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Nov 24 06:53:40.671492 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Nov 24 06:53:40.672048 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Nov 24 06:53:40.673712 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Nov 24 06:53:40.700673 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (691) Nov 24 06:53:40.705074 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:53:40.705092 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:53:40.709808 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:53:40.709825 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:53:40.713670 kernel: BTRFS info (device sda6): last unmount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:53:40.716908 systemd[1]: Finished ignition-setup.service - Ignition (setup). Nov 24 06:53:40.717665 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Nov 24 06:53:40.749876 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Nov 24 06:53:40.750925 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Nov 24 06:53:40.821921 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Nov 24 06:53:40.824054 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Nov 24 06:53:40.842751 ignition[711]: Ignition 2.22.0 Nov 24 06:53:40.842758 ignition[711]: Stage: fetch-offline Nov 24 06:53:40.842776 ignition[711]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:53:40.842782 ignition[711]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:53:40.842843 ignition[711]: parsed url from cmdline: "" Nov 24 06:53:40.842845 ignition[711]: no config URL provided Nov 24 06:53:40.842848 ignition[711]: reading system config file "/usr/lib/ignition/user.ign" Nov 24 06:53:40.842852 ignition[711]: no config at "/usr/lib/ignition/user.ign" Nov 24 06:53:40.843243 ignition[711]: config successfully fetched Nov 24 06:53:40.843263 ignition[711]: parsing config with SHA512: 25555f19ac6bf5a9a4d968a67e1704b37c55053cc4f3e3465dc2e6fe14a263bf957c4a4df61a6460c83976815405cb35241534eab37452302f634c30433adc5e Nov 24 06:53:40.845557 unknown[711]: fetched base config from "system" Nov 24 06:53:40.845563 unknown[711]: fetched user config from "vmware" Nov 24 06:53:40.845776 ignition[711]: fetch-offline: fetch-offline passed Nov 24 06:53:40.847170 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Nov 24 06:53:40.845814 ignition[711]: Ignition finished successfully Nov 24 06:53:40.853890 systemd-networkd[848]: lo: Link UP Nov 24 06:53:40.854044 systemd-networkd[848]: lo: Gained carrier Nov 24 06:53:40.854839 systemd-networkd[848]: Enumeration completed Nov 24 06:53:40.855018 systemd[1]: Started systemd-networkd.service - Network Configuration. Nov 24 06:53:40.855166 systemd[1]: Reached target network.target - Network. Nov 24 06:53:40.855301 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Nov 24 06:53:40.855549 systemd-networkd[848]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Nov 24 06:53:40.856830 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Nov 24 06:53:40.859674 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Nov 24 06:53:40.859828 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Nov 24 06:53:40.860090 systemd-networkd[848]: ens192: Link UP Nov 24 06:53:40.860096 systemd-networkd[848]: ens192: Gained carrier Nov 24 06:53:40.872442 ignition[852]: Ignition 2.22.0 Nov 24 06:53:40.872676 ignition[852]: Stage: kargs Nov 24 06:53:40.872752 ignition[852]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:53:40.872757 ignition[852]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:53:40.873343 ignition[852]: kargs: kargs passed Nov 24 06:53:40.873367 ignition[852]: Ignition finished successfully Nov 24 06:53:40.874448 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Nov 24 06:53:40.875422 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Nov 24 06:53:40.893977 ignition[859]: Ignition 2.22.0 Nov 24 06:53:40.893984 ignition[859]: Stage: disks Nov 24 06:53:40.894063 ignition[859]: no configs at "/usr/lib/ignition/base.d" Nov 24 06:53:40.894069 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:53:40.895293 ignition[859]: disks: disks passed Nov 24 06:53:40.895422 ignition[859]: Ignition finished successfully Nov 24 06:53:40.896226 systemd[1]: Finished ignition-disks.service - Ignition (disks). Nov 24 06:53:40.896722 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Nov 24 06:53:40.896981 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Nov 24 06:53:40.897240 systemd[1]: Reached target local-fs.target - Local File Systems. Nov 24 06:53:40.897469 systemd[1]: Reached target sysinit.target - System Initialization. Nov 24 06:53:40.897711 systemd[1]: Reached target basic.target - Basic System. Nov 24 06:53:40.898421 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Nov 24 06:53:40.922089 systemd-fsck[867]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Nov 24 06:53:40.923277 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Nov 24 06:53:40.924190 systemd[1]: Mounting sysroot.mount - /sysroot... Nov 24 06:53:41.000666 kernel: EXT4-fs (sda9): mounted filesystem f89e2a65-2a4a-426b-9659-02844cc29a2a r/w with ordered data mode. Quota mode: none. Nov 24 06:53:41.000666 systemd[1]: Mounted sysroot.mount - /sysroot. Nov 24 06:53:41.000981 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Nov 24 06:53:41.001824 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Nov 24 06:53:41.003691 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Nov 24 06:53:41.004095 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Nov 24 06:53:41.004274 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Nov 24 06:53:41.004288 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Nov 24 06:53:41.012671 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Nov 24 06:53:41.013378 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Nov 24 06:53:41.017694 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (875) Nov 24 06:53:41.019723 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:53:41.019739 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:53:41.023020 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:53:41.023063 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:53:41.024168 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Nov 24 06:53:41.047585 initrd-setup-root[900]: cut: /sysroot/etc/passwd: No such file or directory Nov 24 06:53:41.050608 initrd-setup-root[907]: cut: /sysroot/etc/group: No such file or directory Nov 24 06:53:41.052873 initrd-setup-root[914]: cut: /sysroot/etc/shadow: No such file or directory Nov 24 06:53:41.055131 initrd-setup-root[921]: cut: /sysroot/etc/gshadow: No such file or directory Nov 24 06:53:41.109620 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Nov 24 06:53:41.110415 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Nov 24 06:53:41.111734 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Nov 24 06:53:41.129669 kernel: BTRFS info (device sda6): last unmount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:53:41.142835 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Nov 24 06:53:41.151495 ignition[989]: INFO : Ignition 2.22.0 Nov 24 06:53:41.151495 ignition[989]: INFO : Stage: mount Nov 24 06:53:41.151851 ignition[989]: INFO : no configs at "/usr/lib/ignition/base.d" Nov 24 06:53:41.151851 ignition[989]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:53:41.152163 ignition[989]: INFO : mount: mount passed Nov 24 06:53:41.152745 ignition[989]: INFO : Ignition finished successfully Nov 24 06:53:41.152957 systemd[1]: Finished ignition-mount.service - Ignition (mount). Nov 24 06:53:41.153815 systemd[1]: Starting ignition-files.service - Ignition (files)... Nov 24 06:53:41.836608 systemd[1]: sysroot-oem.mount: Deactivated successfully. Nov 24 06:53:41.837852 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Nov 24 06:53:41.855599 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1000) Nov 24 06:53:41.855625 kernel: BTRFS info (device sda6): first mount of filesystem 1e21b02a-5e52-4507-8281-b06fd4c187c7 Nov 24 06:53:41.855641 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Nov 24 06:53:41.860157 kernel: BTRFS info (device sda6): enabling ssd optimizations Nov 24 06:53:41.860179 kernel: BTRFS info (device sda6): enabling free space tree Nov 24 06:53:41.861155 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Nov 24 06:53:41.883617 ignition[1016]: INFO : Ignition 2.22.0 Nov 24 06:53:41.883617 ignition[1016]: INFO : Stage: files Nov 24 06:53:41.884037 ignition[1016]: INFO : no configs at "/usr/lib/ignition/base.d" Nov 24 06:53:41.884037 ignition[1016]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Nov 24 06:53:41.884718 ignition[1016]: DEBUG : files: compiled without relabeling support, skipping Nov 24 06:53:41.885281 ignition[1016]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Nov 24 06:53:41.885281 ignition[1016]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Nov 24 06:53:41.888006 ignition[1016]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Nov 24 06:53:41.888633 ignition[1016]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Nov 24 06:53:41.889043 unknown[1016]: wrote ssh authorized keys file for user: core Nov 24 06:53:41.889335 ignition[1016]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Nov 24 06:53:41.890954 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Nov 24 06:53:41.890954 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 
Nov 24 06:53:41.930914 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Nov 24 06:53:41.991952 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Nov 24 06:53:41.991952 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Nov 24 06:53:41.992509 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Nov 24 06:53:41.994111 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Nov 24 06:53:41.994111 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Nov 24 06:53:41.994111 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Nov 24 06:53:41.995806 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Nov 24 06:53:41.995806 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Nov 24 06:53:41.996403 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Nov 24 06:53:42.403790 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Nov 24 06:53:42.604970 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Nov 24 06:53:42.604970 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Nov 24 06:53:42.605827 ignition[1016]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Nov 24 06:53:42.605827 ignition[1016]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Nov 24 06:53:42.606267 ignition[1016]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Nov 24 06:53:42.606714 ignition[1016]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Nov 24 06:53:42.627802 ignition[1016]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Nov 24 06:53:42.630013 ignition[1016]: INFO : files: files passed
Nov 24 06:53:42.630013 ignition[1016]: INFO : Ignition finished successfully
Nov 24 06:53:42.632081 systemd[1]: Finished ignition-files.service - Ignition (files).
Nov 24 06:53:42.632877 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Nov 24 06:53:42.633716 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Nov 24 06:53:42.640376 systemd[1]: ignition-quench.service: Deactivated successfully.
Nov 24 06:53:42.640439 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Nov 24 06:53:42.642604 initrd-setup-root-after-ignition[1049]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Nov 24 06:53:42.642604 initrd-setup-root-after-ignition[1049]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Nov 24 06:53:42.643690 initrd-setup-root-after-ignition[1053]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Nov 24 06:53:42.644622 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Nov 24 06:53:42.644835 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Nov 24 06:53:42.645335 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Nov 24 06:53:42.681273 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 24 06:53:42.681366 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Nov 24 06:53:42.681736 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Nov 24 06:53:42.681890 systemd[1]: Reached target initrd.target - Initrd Default Target.
Nov 24 06:53:42.682376 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Nov 24 06:53:42.682980 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Nov 24 06:53:42.701246 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Nov 24 06:53:42.702388 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Nov 24 06:53:42.713889 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Nov 24 06:53:42.714172 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Nov 24 06:53:42.714508 systemd[1]: Stopped target timers.target - Timer Units.
Nov 24 06:53:42.714781 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 24 06:53:42.714949 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Nov 24 06:53:42.715320 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Nov 24 06:53:42.715572 systemd[1]: Stopped target basic.target - Basic System.
Nov 24 06:53:42.715821 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Nov 24 06:53:42.716080 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Nov 24 06:53:42.716324 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Nov 24 06:53:42.716599 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Nov 24 06:53:42.716858 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Nov 24 06:53:42.716982 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Nov 24 06:53:42.717124 systemd[1]: Stopped target sysinit.target - System Initialization.
Nov 24 06:53:42.717251 systemd[1]: Stopped target local-fs.target - Local File Systems.
Nov 24 06:53:42.717373 systemd[1]: Stopped target swap.target - Swaps.
Nov 24 06:53:42.717468 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 24 06:53:42.717533 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Nov 24 06:53:42.717722 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Nov 24 06:53:42.717883 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Nov 24 06:53:42.717999 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Nov 24 06:53:42.719159 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Nov 24 06:53:42.719411 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 24 06:53:42.719481 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Nov 24 06:53:42.719921 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Nov 24 06:53:42.720090 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Nov 24 06:53:42.720387 systemd[1]: Stopped target paths.target - Path Units.
Nov 24 06:53:42.720618 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 24 06:53:42.720813 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Nov 24 06:53:42.721090 systemd[1]: Stopped target slices.target - Slice Units.
Nov 24 06:53:42.721343 systemd[1]: Stopped target sockets.target - Socket Units.
Nov 24 06:53:42.721576 systemd[1]: iscsid.socket: Deactivated successfully.
Nov 24 06:53:42.721725 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Nov 24 06:53:42.721961 systemd[1]: iscsiuio.socket: Deactivated successfully.
Nov 24 06:53:42.722009 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Nov 24 06:53:42.722370 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Nov 24 06:53:42.722444 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Nov 24 06:53:42.722858 systemd[1]: ignition-files.service: Deactivated successfully.
Nov 24 06:53:42.722918 systemd[1]: Stopped ignition-files.service - Ignition (files).
Nov 24 06:53:42.723681 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Nov 24 06:53:42.725737 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Nov 24 06:53:42.725952 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 24 06:53:42.726121 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Nov 24 06:53:42.726414 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 24 06:53:42.726573 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Nov 24 06:53:42.728867 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 24 06:53:42.729035 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Nov 24 06:53:42.737902 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Nov 24 06:53:42.740005 ignition[1073]: INFO : Ignition 2.22.0
Nov 24 06:53:42.740005 ignition[1073]: INFO : Stage: umount
Nov 24 06:53:42.740260 ignition[1073]: INFO : no configs at "/usr/lib/ignition/base.d"
Nov 24 06:53:42.740260 ignition[1073]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Nov 24 06:53:42.740933 ignition[1073]: INFO : umount: umount passed
Nov 24 06:53:42.740933 ignition[1073]: INFO : Ignition finished successfully
Nov 24 06:53:42.741934 systemd[1]: ignition-mount.service: Deactivated successfully.
Nov 24 06:53:42.742176 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Nov 24 06:53:42.742640 systemd[1]: Stopped target network.target - Network.
Nov 24 06:53:42.742976 systemd[1]: ignition-disks.service: Deactivated successfully.
Nov 24 06:53:42.743108 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Nov 24 06:53:42.743340 systemd[1]: ignition-kargs.service: Deactivated successfully.
Nov 24 06:53:42.743363 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Nov 24 06:53:42.743704 systemd[1]: ignition-setup.service: Deactivated successfully.
Nov 24 06:53:42.743725 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Nov 24 06:53:42.743966 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Nov 24 06:53:42.743986 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Nov 24 06:53:42.744298 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Nov 24 06:53:42.744708 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Nov 24 06:53:42.751521 systemd[1]: systemd-networkd.service: Deactivated successfully.
Nov 24 06:53:42.751585 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Nov 24 06:53:42.752444 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Nov 24 06:53:42.752539 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Nov 24 06:53:42.752719 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Nov 24 06:53:42.752737 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Nov 24 06:53:42.753706 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Nov 24 06:53:42.753817 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Nov 24 06:53:42.753845 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Nov 24 06:53:42.754015 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Nov 24 06:53:42.754037 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Nov 24 06:53:42.754206 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Nov 24 06:53:42.755037 systemd[1]: systemd-resolved.service: Deactivated successfully.
Nov 24 06:53:42.756037 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Nov 24 06:53:42.758411 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Nov 24 06:53:42.758681 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 06:53:42.758725 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Nov 24 06:53:42.759384 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 06:53:42.759409 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Nov 24 06:53:42.759571 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 24 06:53:42.759594 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Nov 24 06:53:42.766305 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 24 06:53:42.766370 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Nov 24 06:53:42.766715 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 24 06:53:42.766747 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Nov 24 06:53:42.766976 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 24 06:53:42.766991 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Nov 24 06:53:42.767139 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 24 06:53:42.767161 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Nov 24 06:53:42.767436 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 24 06:53:42.767460 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Nov 24 06:53:42.767810 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Nov 24 06:53:42.767833 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Nov 24 06:53:42.769711 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Nov 24 06:53:42.769884 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Nov 24 06:53:42.769912 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Nov 24 06:53:42.770348 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 24 06:53:42.770377 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Nov 24 06:53:42.770699 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Nov 24 06:53:42.770726 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Nov 24 06:53:42.771052 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 24 06:53:42.771083 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Nov 24 06:53:42.771286 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 24 06:53:42.771308 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Nov 24 06:53:42.782321 systemd[1]: network-cleanup.service: Deactivated successfully.
Nov 24 06:53:42.782623 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Nov 24 06:53:42.783880 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 24 06:53:42.783970 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Nov 24 06:53:42.792402 systemd[1]: sysroot-boot.service: Deactivated successfully.
Nov 24 06:53:42.792483 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Nov 24 06:53:42.793001 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Nov 24 06:53:42.793153 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Nov 24 06:53:42.793194 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Nov 24 06:53:42.793978 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Nov 24 06:53:42.812814 systemd[1]: Switching root.
Nov 24 06:53:42.848882 systemd-journald[225]: Journal stopped
Nov 24 06:53:43.963475 systemd-journald[225]: Received SIGTERM from PID 1 (systemd).
Nov 24 06:53:43.963492 kernel: SELinux: policy capability network_peer_controls=1
Nov 24 06:53:43.963500 kernel: SELinux: policy capability open_perms=1
Nov 24 06:53:43.963506 kernel: SELinux: policy capability extended_socket_class=1
Nov 24 06:53:43.963510 kernel: SELinux: policy capability always_check_network=0
Nov 24 06:53:43.963516 kernel: SELinux: policy capability cgroup_seclabel=1
Nov 24 06:53:43.963521 kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 24 06:53:43.963528 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Nov 24 06:53:43.963533 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Nov 24 06:53:43.963539 kernel: SELinux: policy capability userspace_initial_context=0
Nov 24 06:53:43.963544 kernel: audit: type=1403 audit(1763967223.452:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 24 06:53:43.963550 systemd[1]: Successfully loaded SELinux policy in 38.506ms.
Nov 24 06:53:43.963557 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.831ms.
Nov 24 06:53:43.963564 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Nov 24 06:53:43.963571 systemd[1]: Detected virtualization vmware.
Nov 24 06:53:43.963578 systemd[1]: Detected architecture x86-64.
Nov 24 06:53:43.963584 systemd[1]: Detected first boot.
Nov 24 06:53:43.963590 systemd[1]: Initializing machine ID from random generator.
Nov 24 06:53:43.963597 zram_generator::config[1117]: No configuration found.
Nov 24 06:53:43.963680 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Nov 24 06:53:43.963691 kernel: Guest personality initialized and is active
Nov 24 06:53:43.963697 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Nov 24 06:53:43.963703 kernel: Initialized host personality
Nov 24 06:53:43.963717 kernel: NET: Registered PF_VSOCK protocol family
Nov 24 06:53:43.963726 systemd[1]: Populated /etc with preset unit settings.
Nov 24 06:53:43.963734 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Nov 24 06:53:43.963759 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Nov 24 06:53:43.963765 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Nov 24 06:53:43.963771 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 24 06:53:43.963777 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Nov 24 06:53:43.963799 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 24 06:53:43.963807 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Nov 24 06:53:43.963813 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Nov 24 06:53:43.963819 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Nov 24 06:53:43.963825 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Nov 24 06:53:43.963832 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Nov 24 06:53:43.963838 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Nov 24 06:53:43.963844 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Nov 24 06:53:43.963850 systemd[1]: Created slice user.slice - User and Session Slice.
Nov 24 06:53:43.963857 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Nov 24 06:53:43.963864 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Nov 24 06:53:43.963872 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Nov 24 06:53:43.963878 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Nov 24 06:53:43.963885 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Nov 24 06:53:43.963891 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Nov 24 06:53:43.963898 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Nov 24 06:53:43.963904 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Nov 24 06:53:43.963911 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Nov 24 06:53:43.963918 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Nov 24 06:53:43.963924 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Nov 24 06:53:43.963930 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Nov 24 06:53:43.963937 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Nov 24 06:53:43.963943 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Nov 24 06:53:43.963950 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Nov 24 06:53:43.963956 systemd[1]: Reached target slices.target - Slice Units.
Nov 24 06:53:43.963963 systemd[1]: Reached target swap.target - Swaps.
Nov 24 06:53:43.963970 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Nov 24 06:53:43.963977 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Nov 24 06:53:43.963983 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Nov 24 06:53:43.963990 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Nov 24 06:53:43.963997 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Nov 24 06:53:43.964004 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Nov 24 06:53:43.964010 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Nov 24 06:53:43.964017 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Nov 24 06:53:43.964023 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Nov 24 06:53:43.964030 systemd[1]: Mounting media.mount - External Media Directory...
Nov 24 06:53:43.964036 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Nov 24 06:53:43.964042 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Nov 24 06:53:43.964050 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Nov 24 06:53:43.964057 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Nov 24 06:53:43.964064 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 24 06:53:43.964070 systemd[1]: Reached target machines.target - Containers.
Nov 24 06:53:43.964077 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Nov 24 06:53:43.964083 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Nov 24 06:53:43.964089 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Nov 24 06:53:43.964096 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Nov 24 06:53:43.964103 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Nov 24 06:53:43.964110 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Nov 24 06:53:43.964116 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Nov 24 06:53:43.964123 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Nov 24 06:53:43.964129 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Nov 24 06:53:43.964135 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Nov 24 06:53:43.964142 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 24 06:53:43.964148 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Nov 24 06:53:43.964155 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Nov 24 06:53:43.964162 systemd[1]: Stopped systemd-fsck-usr.service.
Nov 24 06:53:43.964169 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Nov 24 06:53:43.964176 systemd[1]: Starting systemd-journald.service - Journal Service...
Nov 24 06:53:43.964182 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Nov 24 06:53:43.964189 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Nov 24 06:53:43.964195 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Nov 24 06:53:43.964201 kernel: fuse: init (API version 7.41)
Nov 24 06:53:43.964207 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Nov 24 06:53:43.964215 kernel: loop: module loaded
Nov 24 06:53:43.964221 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Nov 24 06:53:43.964227 systemd[1]: verity-setup.service: Deactivated successfully.
Nov 24 06:53:43.964233 systemd[1]: Stopped verity-setup.service.
Nov 24 06:53:43.964240 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Nov 24 06:53:43.964247 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Nov 24 06:53:43.964253 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Nov 24 06:53:43.964259 systemd[1]: Mounted media.mount - External Media Directory.
Nov 24 06:53:43.964265 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Nov 24 06:53:43.964273 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Nov 24 06:53:43.964280 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Nov 24 06:53:43.964287 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Nov 24 06:53:43.964293 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 06:53:43.964299 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Nov 24 06:53:43.964306 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Nov 24 06:53:43.964312 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Nov 24 06:53:43.964319 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 24 06:53:43.964327 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Nov 24 06:53:43.964333 kernel: ACPI: bus type drm_connector registered
Nov 24 06:53:43.964339 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 24 06:53:43.964345 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Nov 24 06:53:43.964352 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 24 06:53:43.964358 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Nov 24 06:53:43.964376 systemd-journald[1205]: Collecting audit messages is disabled.
Nov 24 06:53:43.964392 systemd[1]: modprobe@loop.service: Deactivated successfully.
Nov 24 06:53:43.964402 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Nov 24 06:53:43.964410 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Nov 24 06:53:43.964418 systemd-journald[1205]: Journal started
Nov 24 06:53:43.964431 systemd-journald[1205]: Runtime Journal (/run/log/journal/291bea3c7d9f46148c48b770ccf82466) is 4.8M, max 38.5M, 33.7M free.
Nov 24 06:53:43.779886 systemd[1]: Queued start job for default target multi-user.target.
Nov 24 06:53:43.796156 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Nov 24 06:53:43.796412 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 24 06:53:43.964950 jq[1187]: true
Nov 24 06:53:43.965453 jq[1218]: true
Nov 24 06:53:43.965682 systemd[1]: Started systemd-journald.service - Journal Service.
Nov 24 06:53:43.966713 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Nov 24 06:53:43.966962 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Nov 24 06:53:43.967197 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Nov 24 06:53:43.976967 systemd[1]: Reached target network-pre.target - Preparation for Network.
Nov 24 06:53:43.979952 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Nov 24 06:53:43.982359 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Nov 24 06:53:43.982466 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Nov 24 06:53:43.982485 systemd[1]: Reached target local-fs.target - Local File Systems.
Nov 24 06:53:43.983133 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Nov 24 06:53:43.988985 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Nov 24 06:53:43.989130 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 24 06:53:43.990957 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Nov 24 06:53:43.991694 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Nov 24 06:53:43.991847 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 24 06:53:43.994167 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Nov 24 06:53:43.994277 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 24 06:53:43.996316 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Nov 24 06:53:43.998984 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Nov 24 06:53:44.003166 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Nov 24 06:53:44.010141 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Nov 24 06:53:44.010449 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Nov 24 06:53:44.010791 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Nov 24 06:53:44.020301 systemd-journald[1205]: Time spent on flushing to /var/log/journal/291bea3c7d9f46148c48b770ccf82466 is 81.471ms for 1756 entries. Nov 24 06:53:44.020301 systemd-journald[1205]: System Journal (/var/log/journal/291bea3c7d9f46148c48b770ccf82466) is 8M, max 584.8M, 576.8M free. Nov 24 06:53:44.122288 systemd-journald[1205]: Received client request to flush runtime journal. Nov 24 06:53:44.122330 kernel: loop0: detected capacity change from 0 to 2960 Nov 24 06:53:44.122347 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Nov 24 06:53:44.122358 kernel: loop1: detected capacity change from 0 to 110984 Nov 24 06:53:44.022013 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Nov 24 06:53:44.022256 ignition[1241]: Ignition 2.22.0 Nov 24 06:53:44.022266 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Nov 24 06:53:44.022419 ignition[1241]: deleting config from guestinfo properties Nov 24 06:53:44.027988 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Nov 24 06:53:44.024568 ignition[1241]: Successfully deleted config Nov 24 06:53:44.028351 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Nov 24 06:53:44.074376 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Nov 24 06:53:44.075003 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. Nov 24 06:53:44.075012 systemd-tmpfiles[1266]: ACLs are not supported, ignoring. Nov 24 06:53:44.075942 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Nov 24 06:53:44.084094 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Nov 24 06:53:44.086910 systemd[1]: Starting systemd-sysusers.service - Create System Users... Nov 24 06:53:44.125848 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
Nov 24 06:53:44.134786 systemd[1]: Finished systemd-sysusers.service - Create System Users. Nov 24 06:53:44.137628 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Nov 24 06:53:44.162880 kernel: loop2: detected capacity change from 0 to 128560 Nov 24 06:53:44.163244 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Nov 24 06:53:44.163255 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Nov 24 06:53:44.168258 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Nov 24 06:53:44.174684 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Nov 24 06:53:44.192680 kernel: loop3: detected capacity change from 0 to 229808 Nov 24 06:53:44.239724 kernel: loop4: detected capacity change from 0 to 2960 Nov 24 06:53:44.251706 kernel: loop5: detected capacity change from 0 to 110984 Nov 24 06:53:44.277676 kernel: loop6: detected capacity change from 0 to 128560 Nov 24 06:53:44.296683 kernel: loop7: detected capacity change from 0 to 229808 Nov 24 06:53:44.324425 (sd-merge)[1295]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Nov 24 06:53:44.325266 (sd-merge)[1295]: Merged extensions into '/usr'. Nov 24 06:53:44.332117 systemd[1]: Reload requested from client PID 1265 ('systemd-sysext') (unit systemd-sysext.service)... Nov 24 06:53:44.332129 systemd[1]: Reloading... Nov 24 06:53:44.379680 zram_generator::config[1317]: No configuration found. Nov 24 06:53:44.482224 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:53:44.539761 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Nov 24 06:53:44.540453 systemd[1]: Reloading finished in 208 ms. 
Nov 24 06:53:44.567789 ldconfig[1256]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Nov 24 06:53:44.572369 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Nov 24 06:53:44.572772 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Nov 24 06:53:44.573119 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Nov 24 06:53:44.577626 systemd[1]: Starting ensure-sysext.service... Nov 24 06:53:44.580086 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Nov 24 06:53:44.581277 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Nov 24 06:53:44.587855 systemd[1]: Reload requested from client PID 1378 ('systemctl') (unit ensure-sysext.service)... Nov 24 06:53:44.587865 systemd[1]: Reloading... Nov 24 06:53:44.603782 systemd-tmpfiles[1379]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Nov 24 06:53:44.603803 systemd-tmpfiles[1379]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Nov 24 06:53:44.603960 systemd-tmpfiles[1379]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Nov 24 06:53:44.604116 systemd-tmpfiles[1379]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Nov 24 06:53:44.604173 systemd-udevd[1380]: Using default interface naming scheme 'v255'. Nov 24 06:53:44.604592 systemd-tmpfiles[1379]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Nov 24 06:53:44.604772 systemd-tmpfiles[1379]: ACLs are not supported, ignoring. Nov 24 06:53:44.604816 systemd-tmpfiles[1379]: ACLs are not supported, ignoring. Nov 24 06:53:44.606565 systemd-tmpfiles[1379]: Detected autofs mount point /boot during canonicalization of boot. 
Nov 24 06:53:44.606570 systemd-tmpfiles[1379]: Skipping /boot Nov 24 06:53:44.615345 systemd-tmpfiles[1379]: Detected autofs mount point /boot during canonicalization of boot. Nov 24 06:53:44.615353 systemd-tmpfiles[1379]: Skipping /boot Nov 24 06:53:44.626674 zram_generator::config[1403]: No configuration found. Nov 24 06:53:44.736570 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Nov 24 06:53:44.769670 kernel: mousedev: PS/2 mouse device common for all mice Nov 24 06:53:44.786673 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Nov 24 06:53:44.792675 kernel: ACPI: button: Power Button [PWRF] Nov 24 06:53:44.808241 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Nov 24 06:53:44.808422 systemd[1]: Reloading finished in 220 ms. Nov 24 06:53:44.816394 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Nov 24 06:53:44.821527 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Nov 24 06:53:44.835738 systemd[1]: Starting audit-rules.service - Load Audit Rules... Nov 24 06:53:44.837578 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Nov 24 06:53:44.839092 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Nov 24 06:53:44.841321 systemd[1]: Starting systemd-networkd.service - Network Configuration... Nov 24 06:53:44.846392 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Nov 24 06:53:44.847883 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Nov 24 06:53:44.854832 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Nov 24 06:53:44.857633 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.859857 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Nov 24 06:53:44.860803 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Nov 24 06:53:44.862736 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Nov 24 06:53:44.862928 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:53:44.862992 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:53:44.863056 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.867687 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Nov 24 06:53:44.869387 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.869506 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:53:44.869569 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:53:44.869627 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.873885 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Nov 24 06:53:44.874868 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Nov 24 06:53:44.880886 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.882941 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Nov 24 06:53:44.883137 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Nov 24 06:53:44.883157 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Nov 24 06:53:44.883189 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Nov 24 06:53:44.883223 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Nov 24 06:53:44.883559 systemd[1]: Finished ensure-sysext.service. Nov 24 06:53:44.888790 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Nov 24 06:53:44.905697 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Nov 24 06:53:44.908938 systemd[1]: Starting systemd-update-done.service - Update is Completed... Nov 24 06:53:44.915246 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Nov 24 06:53:44.919901 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Nov 24 06:53:44.920225 systemd[1]: modprobe@loop.service: Deactivated successfully. Nov 24 06:53:44.920547 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Nov 24 06:53:44.920892 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Nov 24 06:53:44.921002 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Nov 24 06:53:44.923159 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Nov 24 06:53:44.929142 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Nov 24 06:53:44.930801 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Nov 24 06:53:44.932741 systemd[1]: Started systemd-userdbd.service - User Database Manager. Nov 24 06:53:44.933049 systemd[1]: Finished systemd-update-done.service - Update is Completed. Nov 24 06:53:44.933806 augenrules[1541]: No rules Nov 24 06:53:44.934509 systemd[1]: audit-rules.service: Deactivated successfully. Nov 24 06:53:44.934638 systemd[1]: Finished audit-rules.service - Load Audit Rules. Nov 24 06:53:44.945940 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Nov 24 06:53:44.993378 systemd-resolved[1503]: Positive Trust Anchors: Nov 24 06:53:44.993385 systemd-resolved[1503]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Nov 24 06:53:44.993408 systemd-resolved[1503]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Nov 24 06:53:45.001677 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Nov 24 06:53:45.006795 systemd-resolved[1503]: Defaulting to hostname 'linux'. 
Nov 24 06:53:45.007779 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Nov 24 06:53:45.008057 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Nov 24 06:53:45.013062 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Nov 24 06:53:45.013282 systemd[1]: Reached target time-set.target - System Time Set. Nov 24 06:53:45.021049 systemd-networkd[1502]: lo: Link UP Nov 24 06:53:45.021054 systemd-networkd[1502]: lo: Gained carrier Nov 24 06:53:45.021866 systemd-networkd[1502]: Enumeration completed Nov 24 06:53:45.022079 systemd-networkd[1502]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Nov 24 06:53:45.022754 systemd[1]: Started systemd-networkd.service - Network Configuration. Nov 24 06:53:45.022937 systemd[1]: Reached target network.target - Network. Nov 24 06:53:45.024278 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Nov 24 06:53:45.026695 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Nov 24 06:53:45.026846 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Nov 24 06:53:45.027019 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Nov 24 06:53:45.028108 systemd-networkd[1502]: ens192: Link UP Nov 24 06:53:45.028206 systemd-networkd[1502]: ens192: Gained carrier Nov 24 06:53:45.032184 systemd-timesyncd[1520]: Network configuration changed, trying to establish connection. Nov 24 06:53:45.045636 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Nov 24 06:53:45.063689 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Nov 24 06:53:45.063954 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Nov 24 06:53:45.063981 systemd[1]: Reached target sysinit.target - System Initialization. Nov 24 06:53:45.064159 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Nov 24 06:53:45.064294 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Nov 24 06:53:45.064414 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Nov 24 06:53:45.064600 systemd[1]: Started logrotate.timer - Daily rotation of log files. Nov 24 06:53:45.064766 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Nov 24 06:53:45.064884 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Nov 24 06:53:45.064999 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Nov 24 06:53:45.065017 systemd[1]: Reached target paths.target - Path Units. Nov 24 06:53:45.065113 systemd[1]: Reached target timers.target - Timer Units. Nov 24 06:53:45.065970 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Nov 24 06:53:45.066936 systemd[1]: Starting docker.socket - Docker Socket for the API... Nov 24 06:53:45.069058 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Nov 24 06:53:45.069774 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Nov 24 06:53:45.069916 systemd[1]: Reached target ssh-access.target - SSH Access Available. Nov 24 06:53:45.074029 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Nov 24 06:53:45.074842 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Nov 24 06:53:45.075334 systemd[1]: Listening on docker.socket - Docker Socket for the API. Nov 24 06:53:45.082240 systemd[1]: Reached target sockets.target - Socket Units. Nov 24 06:53:45.082400 systemd[1]: Reached target basic.target - Basic System. Nov 24 06:53:45.082557 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Nov 24 06:53:45.082621 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Nov 24 06:53:45.083792 systemd[1]: Starting containerd.service - containerd container runtime... Nov 24 06:53:45.086520 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Nov 24 06:53:45.089767 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Nov 24 06:53:45.090396 (udev-worker)[1446]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Nov 24 06:53:45.092753 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Nov 24 06:53:45.094720 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Nov 24 06:53:45.094836 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Nov 24 06:53:45.100334 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Nov 24 06:53:45.102814 jq[1581]: false Nov 24 06:53:45.105884 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Nov 24 06:53:45.107775 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Nov 24 06:53:45.110144 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Nov 24 06:53:45.114813 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Nov 24 06:53:45.117996 systemd[1]: Starting systemd-logind.service - User Login Management... Nov 24 06:53:45.118916 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Nov 24 06:53:45.119776 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Nov 24 06:53:45.120314 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Nov 24 06:53:45.126766 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Refreshing passwd entry cache Nov 24 06:53:45.124859 systemd[1]: Starting update-engine.service - Update Engine... Nov 24 06:53:45.124596 oslogin_cache_refresh[1583]: Refreshing passwd entry cache Nov 24 06:53:45.128871 extend-filesystems[1582]: Found /dev/sda6 Nov 24 06:53:45.129935 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Nov 24 06:53:45.136984 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Failure getting users, quitting Nov 24 06:53:45.136984 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Nov 24 06:53:45.136984 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Refreshing group entry cache Nov 24 06:53:45.134236 oslogin_cache_refresh[1583]: Failure getting users, quitting Nov 24 06:53:45.134248 oslogin_cache_refresh[1583]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Nov 24 06:53:45.134275 oslogin_cache_refresh[1583]: Refreshing group entry cache Nov 24 06:53:45.137343 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... 
Nov 24 06:53:45.142783 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Failure getting groups, quitting Nov 24 06:53:45.142783 google_oslogin_nss_cache[1583]: oslogin_cache_refresh[1583]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Nov 24 06:53:45.137463 oslogin_cache_refresh[1583]: Failure getting groups, quitting Nov 24 06:53:45.140098 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Nov 24 06:53:45.137471 oslogin_cache_refresh[1583]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Nov 24 06:53:45.140436 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Nov 24 06:53:45.141702 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Nov 24 06:53:45.142027 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Nov 24 06:53:45.142179 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Nov 24 06:53:45.151583 extend-filesystems[1582]: Found /dev/sda9 Nov 24 06:53:45.153138 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Nov 24 06:53:45.155272 extend-filesystems[1582]: Checking size of /dev/sda9 Nov 24 06:53:45.153294 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Nov 24 06:53:45.156113 systemd[1]: motdgen.service: Deactivated successfully. Nov 24 06:53:45.156347 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Nov 24 06:53:45.164048 jq[1600]: true Nov 24 06:53:45.172675 extend-filesystems[1582]: Old size kept for /dev/sda9 Nov 24 06:53:45.173373 systemd[1]: extend-filesystems.service: Deactivated successfully. Nov 24 06:53:45.173531 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Nov 24 06:53:45.175591 update_engine[1595]: I20251124 06:53:45.175546 1595 main.cc:92] Flatcar Update Engine starting Nov 24 06:53:45.179088 (ntainerd)[1621]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Nov 24 06:53:45.182786 jq[1620]: true Nov 24 06:53:45.204155 dbus-daemon[1579]: [system] SELinux support is enabled Nov 24 06:53:45.208584 update_engine[1595]: I20251124 06:53:45.208415 1595 update_check_scheduler.cc:74] Next update check in 6m18s Nov 24 06:53:45.278836 systemd-logind[1591]: Watching system buttons on /dev/input/event2 (Power Button) Nov 24 06:53:45.278850 systemd-logind[1591]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Nov 24 06:53:45.279070 systemd-logind[1591]: New seat seat0. Nov 24 06:53:45.358797 systemd[1]: Started dbus.service - D-Bus System Message Bus. Nov 24 06:53:45.361398 systemd[1]: Started systemd-logind.service - User Login Management. Nov 24 06:53:45.361885 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Nov 24 06:53:45.363734 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Nov 24 06:53:45.369693 tar[1609]: linux-amd64/LICENSE Nov 24 06:53:45.371333 dbus-daemon[1579]: [system] Successfully activated service 'org.freedesktop.systemd1' Nov 24 06:53:45.376697 tar[1609]: linux-amd64/helm Nov 24 06:53:45.380777 systemd[1]: Started update-engine.service - Update Engine. Nov 24 06:53:45.381597 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Nov 24 06:53:45.381830 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Nov 24 06:53:45.382165 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Nov 24 06:53:45.382236 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Nov 24 06:53:45.383964 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Nov 24 06:53:45.388325 systemd[1]: Started locksmithd.service - Cluster reboot manager. Nov 24 06:53:45.409968 bash[1661]: Updated "/home/core/.ssh/authorized_keys" Nov 24 06:53:45.412990 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Nov 24 06:53:45.414336 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Nov 24 06:53:45.422460 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Nov 24 06:53:45.433256 unknown[1641]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Nov 24 06:53:45.438066 unknown[1641]: Core dump limit set to -1 Nov 24 06:53:45.470291 containerd[1621]: time="2025-11-24T06:53:45Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Nov 24 06:53:45.472248 containerd[1621]: time="2025-11-24T06:53:45.472232398Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Nov 24 06:53:45.483556 sshd_keygen[1626]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Nov 24 06:53:45.494008 containerd[1621]: time="2025-11-24T06:53:45.493977047Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.138µs" Nov 24 06:53:45.494008 containerd[1621]: time="2025-11-24T06:53:45.494000586Z" level=info msg="loading plugin" 
id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Nov 24 06:53:45.494008 containerd[1621]: time="2025-11-24T06:53:45.494012467Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Nov 24 06:53:45.494120 containerd[1621]: time="2025-11-24T06:53:45.494107282Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Nov 24 06:53:45.494120 containerd[1621]: time="2025-11-24T06:53:45.494119263Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Nov 24 06:53:45.494164 containerd[1621]: time="2025-11-24T06:53:45.494133756Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494180 containerd[1621]: time="2025-11-24T06:53:45.494171330Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494180 containerd[1621]: time="2025-11-24T06:53:45.494178823Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494324 containerd[1621]: time="2025-11-24T06:53:45.494310225Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494324 containerd[1621]: time="2025-11-24T06:53:45.494321099Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494369 containerd[1621]: time="2025-11-24T06:53:45.494330753Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Nov 24 
06:53:45.494369 containerd[1621]: time="2025-11-24T06:53:45.494335960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494411 containerd[1621]: time="2025-11-24T06:53:45.494380268Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494503 containerd[1621]: time="2025-11-24T06:53:45.494490910Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494525 containerd[1621]: time="2025-11-24T06:53:45.494511128Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Nov 24 06:53:45.494525 containerd[1621]: time="2025-11-24T06:53:45.494519287Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Nov 24 06:53:45.494563 containerd[1621]: time="2025-11-24T06:53:45.494536680Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Nov 24 06:53:45.497060 containerd[1621]: time="2025-11-24T06:53:45.497038965Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Nov 24 06:53:45.497092 containerd[1621]: time="2025-11-24T06:53:45.497084986Z" level=info msg="metadata content store policy set" policy=shared Nov 24 06:53:45.501328 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Nov 24 06:53:45.505381 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Nov 24 06:53:45.509903 containerd[1621]: time="2025-11-24T06:53:45.509870794Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Nov 24 06:53:45.509967 containerd[1621]: time="2025-11-24T06:53:45.509918225Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Nov 24 06:53:45.509967 containerd[1621]: time="2025-11-24T06:53:45.509933249Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Nov 24 06:53:45.509967 containerd[1621]: time="2025-11-24T06:53:45.509944526Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Nov 24 06:53:45.509967 containerd[1621]: time="2025-11-24T06:53:45.509953980Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Nov 24 06:53:45.509967 containerd[1621]: time="2025-11-24T06:53:45.509964157Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Nov 24 06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.509972112Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Nov 24 06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.509978998Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Nov 24 06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.509986811Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Nov 24 06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.509995057Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Nov 24 06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.510000786Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Nov 24 
06:53:45.510058 containerd[1621]: time="2025-11-24T06:53:45.510013082Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510111257Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510123998Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510134039Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510143633Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510153258Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Nov 24 06:53:45.510167 containerd[1621]: time="2025-11-24T06:53:45.510162543Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510174691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510184761Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510195404Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510204037Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510210030Z" 
level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510244173Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Nov 24 06:53:45.510260 containerd[1621]: time="2025-11-24T06:53:45.510255148Z" level=info msg="Start snapshots syncer" Nov 24 06:53:45.510372 containerd[1621]: time="2025-11-24T06:53:45.510276781Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Nov 24 06:53:45.510545 containerd[1621]: time="2025-11-24T06:53:45.510513082Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\"
:true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Nov 24 06:53:45.510624 containerd[1621]: time="2025-11-24T06:53:45.510555106Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Nov 24 06:53:45.510624 containerd[1621]: time="2025-11-24T06:53:45.510588438Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Nov 24 06:53:45.511233 containerd[1621]: time="2025-11-24T06:53:45.510653357Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Nov 24 06:53:45.511233 containerd[1621]: time="2025-11-24T06:53:45.511209244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Nov 24 06:53:45.511233 containerd[1621]: time="2025-11-24T06:53:45.511218666Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Nov 24 06:53:45.511233 containerd[1621]: time="2025-11-24T06:53:45.511225479Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511233396Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511274344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511289655Z" level=info msg="loading plugin" 
id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511305081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511313733Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511321709Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511347440Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511360966Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511366764Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Nov 24 06:53:45.511371 containerd[1621]: time="2025-11-24T06:53:45.511372063Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511376371Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511497407Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511511537Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Nov 24 
06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511521634Z" level=info msg="runtime interface created" Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511524652Z" level=info msg="created NRI interface" Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511530169Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511550534Z" level=info msg="Connect containerd service" Nov 24 06:53:45.511647 containerd[1621]: time="2025-11-24T06:53:45.511573073Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Nov 24 06:53:45.512293 containerd[1621]: time="2025-11-24T06:53:45.512276337Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Nov 24 06:53:45.526283 systemd[1]: issuegen.service: Deactivated successfully. Nov 24 06:53:45.526459 systemd[1]: Finished issuegen.service - Generate /run/issue. Nov 24 06:53:45.536192 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Nov 24 06:53:45.576934 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Nov 24 06:53:45.579910 systemd[1]: Started getty@tty1.service - Getty on tty1. Nov 24 06:53:45.581531 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Nov 24 06:53:45.581738 systemd[1]: Reached target getty.target - Login Prompts. 
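The `failed to load cni during init` error above is benign at this point in boot: containerd's CRI plugin found no network config under /etc/cni/net.d, which is expected before a CNI plugin (or a later kubeadm/kubelet step) installs one. A minimal sketch of a conflist of the shape the loader scans for, written to a temp dir rather than /etc/cni/net.d to stay side-effect free; the network name, bridge device, and subnet are illustrative assumptions, not values from this host:

```python
import json
import os
import tempfile

# Minimal CNI network config of the shape containerd's CRI plugin
# scans for in /etc/cni/net.d. The name, bridge device, and subnet
# below are illustrative assumptions, not values taken from this host.
conflist = {
    "cniVersion": "1.0.0",
    "name": "demo-net",
    "plugins": [
        {
            "type": "bridge",
            "bridge": "cni0",
            "isGateway": True,
            "ipMasq": True,
            "ipam": {"type": "host-local", "subnet": "10.88.0.0/16"},
        }
    ],
}

# On a real node this would land in /etc/cni/net.d/10-demo-net.conflist;
# a temp dir keeps the sketch harmless to run anywhere.
cni_dir = tempfile.mkdtemp()
conf_path = os.path.join(cni_dir, "10-demo-net.conflist")
with open(conf_path, "w") as f:
    json.dump(conflist, f, indent=2)

print(conf_path.endswith(".conflist"))  # True
```

Once a file like this exists, the CRI plugin's cni conf syncer (started later in this log as "Start cni network conf syncer for default") picks it up without a containerd restart.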
Nov 24 06:53:45.581924 locksmithd[1660]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Nov 24 06:53:45.647328 containerd[1621]: time="2025-11-24T06:53:45.647191837Z" level=info msg="Start subscribing containerd event" Nov 24 06:53:45.647328 containerd[1621]: time="2025-11-24T06:53:45.647224732Z" level=info msg="Start recovering state" Nov 24 06:53:45.647328 containerd[1621]: time="2025-11-24T06:53:45.647312519Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Nov 24 06:53:45.647416 containerd[1621]: time="2025-11-24T06:53:45.647353260Z" level=info msg=serving... address=/run/containerd/containerd.sock Nov 24 06:53:45.647637 containerd[1621]: time="2025-11-24T06:53:45.647554763Z" level=info msg="Start event monitor" Nov 24 06:53:45.647637 containerd[1621]: time="2025-11-24T06:53:45.647566722Z" level=info msg="Start cni network conf syncer for default" Nov 24 06:53:45.647637 containerd[1621]: time="2025-11-24T06:53:45.647571267Z" level=info msg="Start streaming server" Nov 24 06:53:45.647637 containerd[1621]: time="2025-11-24T06:53:45.647576889Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Nov 24 06:53:45.647637 containerd[1621]: time="2025-11-24T06:53:45.647581109Z" level=info msg="runtime interface starting up..." Nov 24 06:53:45.648264 containerd[1621]: time="2025-11-24T06:53:45.648239259Z" level=info msg="starting plugins..." Nov 24 06:53:45.648420 containerd[1621]: time="2025-11-24T06:53:45.648335677Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Nov 24 06:53:45.648855 containerd[1621]: time="2025-11-24T06:53:45.648734687Z" level=info msg="containerd successfully booted in 0.178559s" Nov 24 06:53:45.648743 systemd[1]: Started containerd.service - containerd container runtime. Nov 24 06:53:45.705396 tar[1609]: linux-amd64/README.md Nov 24 06:53:45.720575 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Nov 24 06:53:47.035832 systemd-networkd[1502]: ens192: Gained IPv6LL Nov 24 06:53:47.036213 systemd-timesyncd[1520]: Network configuration changed, trying to establish connection. Nov 24 06:53:47.037616 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Nov 24 06:53:47.038155 systemd[1]: Reached target network-online.target - Network is Online. Nov 24 06:53:47.039425 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Nov 24 06:53:47.049900 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:53:47.052818 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Nov 24 06:53:47.092479 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Nov 24 06:53:47.092977 systemd[1]: coreos-metadata.service: Deactivated successfully. Nov 24 06:53:47.093119 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Nov 24 06:53:47.093704 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Nov 24 06:53:48.826962 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:53:48.827557 systemd[1]: Reached target multi-user.target - Multi-User System. Nov 24 06:53:48.827982 systemd[1]: Startup finished in 2.571s (kernel) + 4.874s (initrd) + 5.412s (userspace) = 12.859s. Nov 24 06:53:48.834975 (kubelet)[1786]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:53:48.877303 login[1709]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Nov 24 06:53:48.879337 login[1710]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Nov 24 06:53:48.885402 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Nov 24 06:53:48.886186 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Nov 24 06:53:48.891929 systemd-logind[1591]: New session 1 of user core. Nov 24 06:53:48.894547 systemd-logind[1591]: New session 2 of user core. Nov 24 06:53:48.904345 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Nov 24 06:53:48.906178 systemd[1]: Starting user@500.service - User Manager for UID 500... Nov 24 06:53:48.924326 (systemd)[1793]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Nov 24 06:53:48.925632 systemd-logind[1591]: New session c1 of user core. Nov 24 06:53:48.943967 systemd-timesyncd[1520]: Network configuration changed, trying to establish connection. Nov 24 06:53:49.021100 systemd[1793]: Queued start job for default target default.target. Nov 24 06:53:49.028630 systemd[1793]: Created slice app.slice - User Application Slice. Nov 24 06:53:49.028667 systemd[1793]: Reached target paths.target - Paths. Nov 24 06:53:49.028713 systemd[1793]: Reached target timers.target - Timers. Nov 24 06:53:49.029673 systemd[1793]: Starting dbus.socket - D-Bus User Message Bus Socket... Nov 24 06:53:49.036888 systemd[1793]: Listening on dbus.socket - D-Bus User Message Bus Socket. Nov 24 06:53:49.036925 systemd[1793]: Reached target sockets.target - Sockets. Nov 24 06:53:49.036953 systemd[1793]: Reached target basic.target - Basic System. Nov 24 06:53:49.036982 systemd[1793]: Reached target default.target - Main User Target. Nov 24 06:53:49.037000 systemd[1793]: Startup finished in 107ms. Nov 24 06:53:49.037299 systemd[1]: Started user@500.service - User Manager for UID 500. Nov 24 06:53:49.039076 systemd[1]: Started session-1.scope - Session 1 of User core. Nov 24 06:53:49.040615 systemd[1]: Started session-2.scope - Session 2 of User core. 
Nov 24 06:53:49.688744 kubelet[1786]: E1124 06:53:49.688699 1786 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:53:49.690756 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:53:49.690869 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:53:49.691258 systemd[1]: kubelet.service: Consumed 747ms CPU time, 266.5M memory peak. Nov 24 06:53:59.941233 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Nov 24 06:53:59.942305 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:54:00.186470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:54:00.188839 (kubelet)[1836]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:54:00.242577 kubelet[1836]: E1124 06:54:00.242511 1836 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:54:00.245287 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:54:00.245428 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:54:00.245757 systemd[1]: kubelet.service: Consumed 92ms CPU time, 108.9M memory peak. Nov 24 06:54:10.495743 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
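The kubelet failures above are the normal pre-bootstrap state: /var/lib/kubelet/config.yaml is written by `kubeadm init`/`kubeadm join`, so until a node is joined the kubelet exits and systemd re-launches it on a fixed interval. That interval can be read straight off the logged timestamps; the inference that the unit uses a ~10 s `RestartSec` is drawn from the log, not from the unit file itself:

```python
from datetime import datetime

# Timestamps taken from the log above: the kubelet exits at 06:53:49,
# restart #1 is scheduled at 06:53:59 and restart #2 at 06:54:10,
# consistent with a fixed ~10 s RestartSec in the unit file (an
# inference from the log, not a value read from the unit).
fmt = "%H:%M:%S"
events = ["06:53:49", "06:53:59", "06:54:10"]
times = [datetime.strptime(t, fmt) for t in events]
gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
print(gaps)  # [10.0, 11.0]
```

The extra second in the second gap is just the time the failing kubelet run itself consumed before exiting.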
Nov 24 06:54:10.497317 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:54:10.849199 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:54:10.861058 (kubelet)[1851]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:54:10.916473 kubelet[1851]: E1124 06:54:10.916439 1851 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:54:10.918061 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:54:10.918150 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:54:10.918352 systemd[1]: kubelet.service: Consumed 118ms CPU time, 110.7M memory peak. Nov 24 06:54:15.518028 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Nov 24 06:54:15.519518 systemd[1]: Started sshd@0-139.178.70.102:22-147.75.109.163:45428.service - OpenSSH per-connection server daemon (147.75.109.163:45428). Nov 24 06:54:15.580167 sshd[1859]: Accepted publickey for core from 147.75.109.163 port 45428 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:15.581067 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:15.584072 systemd-logind[1591]: New session 3 of user core. Nov 24 06:54:15.591760 systemd[1]: Started session-3.scope - Session 3 of User core. Nov 24 06:54:15.646815 systemd[1]: Started sshd@1-139.178.70.102:22-147.75.109.163:45432.service - OpenSSH per-connection server daemon (147.75.109.163:45432). 
Nov 24 06:54:15.690560 sshd[1865]: Accepted publickey for core from 147.75.109.163 port 45432 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:15.691240 sshd-session[1865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:15.694279 systemd-logind[1591]: New session 4 of user core. Nov 24 06:54:15.702772 systemd[1]: Started session-4.scope - Session 4 of User core. Nov 24 06:54:15.751368 sshd[1868]: Connection closed by 147.75.109.163 port 45432 Nov 24 06:54:15.751895 sshd-session[1865]: pam_unix(sshd:session): session closed for user core Nov 24 06:54:15.761140 systemd[1]: sshd@1-139.178.70.102:22-147.75.109.163:45432.service: Deactivated successfully. Nov 24 06:54:15.762317 systemd[1]: session-4.scope: Deactivated successfully. Nov 24 06:54:15.763007 systemd-logind[1591]: Session 4 logged out. Waiting for processes to exit. Nov 24 06:54:15.765010 systemd[1]: Started sshd@2-139.178.70.102:22-147.75.109.163:45438.service - OpenSSH per-connection server daemon (147.75.109.163:45438). Nov 24 06:54:15.765733 systemd-logind[1591]: Removed session 4. Nov 24 06:54:15.806459 sshd[1874]: Accepted publickey for core from 147.75.109.163 port 45438 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:15.807730 sshd-session[1874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:15.811075 systemd-logind[1591]: New session 5 of user core. Nov 24 06:54:15.817815 systemd[1]: Started session-5.scope - Session 5 of User core. Nov 24 06:54:15.865168 sshd[1877]: Connection closed by 147.75.109.163 port 45438 Nov 24 06:54:15.866044 sshd-session[1874]: pam_unix(sshd:session): session closed for user core Nov 24 06:54:15.878994 systemd[1]: sshd@2-139.178.70.102:22-147.75.109.163:45438.service: Deactivated successfully. Nov 24 06:54:15.879928 systemd[1]: session-5.scope: Deactivated successfully. Nov 24 06:54:15.880438 systemd-logind[1591]: Session 5 logged out. 
Waiting for processes to exit. Nov 24 06:54:15.881861 systemd[1]: Started sshd@3-139.178.70.102:22-147.75.109.163:45446.service - OpenSSH per-connection server daemon (147.75.109.163:45446). Nov 24 06:54:15.882525 systemd-logind[1591]: Removed session 5. Nov 24 06:54:15.926291 sshd[1883]: Accepted publickey for core from 147.75.109.163 port 45446 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:15.927046 sshd-session[1883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:15.929634 systemd-logind[1591]: New session 6 of user core. Nov 24 06:54:15.940793 systemd[1]: Started session-6.scope - Session 6 of User core. Nov 24 06:54:15.989277 sshd[1886]: Connection closed by 147.75.109.163 port 45446 Nov 24 06:54:15.989711 sshd-session[1883]: pam_unix(sshd:session): session closed for user core Nov 24 06:54:15.997173 systemd[1]: sshd@3-139.178.70.102:22-147.75.109.163:45446.service: Deactivated successfully. Nov 24 06:54:15.998298 systemd[1]: session-6.scope: Deactivated successfully. Nov 24 06:54:15.998919 systemd-logind[1591]: Session 6 logged out. Waiting for processes to exit. Nov 24 06:54:16.000262 systemd[1]: Started sshd@4-139.178.70.102:22-147.75.109.163:45458.service - OpenSSH per-connection server daemon (147.75.109.163:45458). Nov 24 06:54:16.002171 systemd-logind[1591]: Removed session 6. Nov 24 06:54:16.040986 sshd[1892]: Accepted publickey for core from 147.75.109.163 port 45458 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:16.041938 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:16.045712 systemd-logind[1591]: New session 7 of user core. Nov 24 06:54:16.054854 systemd[1]: Started session-7.scope - Session 7 of User core. 
Nov 24 06:54:16.113588 sudo[1896]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Nov 24 06:54:16.113793 sudo[1896]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:54:16.123952 sudo[1896]: pam_unix(sudo:session): session closed for user root Nov 24 06:54:16.124674 sshd[1895]: Connection closed by 147.75.109.163 port 45458 Nov 24 06:54:16.124941 sshd-session[1892]: pam_unix(sshd:session): session closed for user core Nov 24 06:54:16.130791 systemd[1]: sshd@4-139.178.70.102:22-147.75.109.163:45458.service: Deactivated successfully. Nov 24 06:54:16.131699 systemd[1]: session-7.scope: Deactivated successfully. Nov 24 06:54:16.132189 systemd-logind[1591]: Session 7 logged out. Waiting for processes to exit. Nov 24 06:54:16.133762 systemd[1]: Started sshd@5-139.178.70.102:22-147.75.109.163:45470.service - OpenSSH per-connection server daemon (147.75.109.163:45470). Nov 24 06:54:16.134362 systemd-logind[1591]: Removed session 7. Nov 24 06:54:16.169388 sshd[1902]: Accepted publickey for core from 147.75.109.163 port 45470 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:16.170043 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:16.172422 systemd-logind[1591]: New session 8 of user core. Nov 24 06:54:16.180997 systemd[1]: Started session-8.scope - Session 8 of User core. 
Nov 24 06:54:16.229222 sudo[1907]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Nov 24 06:54:16.229649 sudo[1907]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:54:16.246816 sudo[1907]: pam_unix(sudo:session): session closed for user root Nov 24 06:54:16.250700 sudo[1906]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Nov 24 06:54:16.250896 sudo[1906]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:54:16.258328 systemd[1]: Starting audit-rules.service - Load Audit Rules... Nov 24 06:54:16.290503 augenrules[1929]: No rules Nov 24 06:54:16.291129 systemd[1]: audit-rules.service: Deactivated successfully. Nov 24 06:54:16.291348 systemd[1]: Finished audit-rules.service - Load Audit Rules. Nov 24 06:54:16.291901 sudo[1906]: pam_unix(sudo:session): session closed for user root Nov 24 06:54:16.292786 sshd[1905]: Connection closed by 147.75.109.163 port 45470 Nov 24 06:54:16.293738 sshd-session[1902]: pam_unix(sshd:session): session closed for user core Nov 24 06:54:16.297596 systemd[1]: sshd@5-139.178.70.102:22-147.75.109.163:45470.service: Deactivated successfully. Nov 24 06:54:16.298517 systemd[1]: session-8.scope: Deactivated successfully. Nov 24 06:54:16.299252 systemd-logind[1591]: Session 8 logged out. Waiting for processes to exit. Nov 24 06:54:16.300189 systemd[1]: Started sshd@6-139.178.70.102:22-147.75.109.163:45480.service - OpenSSH per-connection server daemon (147.75.109.163:45480). Nov 24 06:54:16.301870 systemd-logind[1591]: Removed session 8. 
Nov 24 06:54:16.341778 sshd[1938]: Accepted publickey for core from 147.75.109.163 port 45480 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:54:16.342628 sshd-session[1938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:54:16.345780 systemd-logind[1591]: New session 9 of user core. Nov 24 06:54:16.353753 systemd[1]: Started session-9.scope - Session 9 of User core. Nov 24 06:54:16.403294 sudo[1942]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Nov 24 06:54:16.404139 sudo[1942]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Nov 24 06:54:16.807871 systemd[1]: Starting docker.service - Docker Application Container Engine... Nov 24 06:54:16.815848 (dockerd)[1961]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Nov 24 06:54:17.029003 dockerd[1961]: time="2025-11-24T06:54:17.028842215Z" level=info msg="Starting up" Nov 24 06:54:17.029672 dockerd[1961]: time="2025-11-24T06:54:17.029619072Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Nov 24 06:54:17.037398 dockerd[1961]: time="2025-11-24T06:54:17.037373811Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Nov 24 06:54:17.044890 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport73913022-merged.mount: Deactivated successfully. Nov 24 06:54:17.062830 dockerd[1961]: time="2025-11-24T06:54:17.062523736Z" level=info msg="Loading containers: start." Nov 24 06:54:17.070674 kernel: Initializing XFRM netlink socket Nov 24 06:54:17.200881 systemd-timesyncd[1520]: Network configuration changed, trying to establish connection. 
Nov 24 06:54:17.225376 systemd-networkd[1502]: docker0: Link UP Nov 24 06:54:17.226907 dockerd[1961]: time="2025-11-24T06:54:17.226870552Z" level=info msg="Loading containers: done." Nov 24 06:54:17.235430 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1060114975-merged.mount: Deactivated successfully. Nov 24 06:54:17.238082 dockerd[1961]: time="2025-11-24T06:54:17.237881417Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Nov 24 06:54:17.238082 dockerd[1961]: time="2025-11-24T06:54:17.237931739Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Nov 24 06:54:17.238082 dockerd[1961]: time="2025-11-24T06:54:17.237972254Z" level=info msg="Initializing buildkit" Nov 24 06:54:17.247782 dockerd[1961]: time="2025-11-24T06:54:17.247756311Z" level=info msg="Completed buildkit initialization" Nov 24 06:54:17.252513 dockerd[1961]: time="2025-11-24T06:54:17.252481017Z" level=info msg="Daemon has completed initialization" Nov 24 06:54:17.253084 dockerd[1961]: time="2025-11-24T06:54:17.252573709Z" level=info msg="API listen on /run/docker.sock" Nov 24 06:54:17.252686 systemd[1]: Started docker.service - Docker Application Container Engine. Nov 24 06:55:58.195297 systemd-resolved[1503]: Clock change detected. Flushing caches. Nov 24 06:55:58.195453 systemd-timesyncd[1520]: Contacted time server 45.84.199.136:123 (2.flatcar.pool.ntp.org). Nov 24 06:55:58.195489 systemd-timesyncd[1520]: Initial clock synchronization to Mon 2025-11-24 06:55:58.195209 UTC. Nov 24 06:55:58.830221 containerd[1621]: time="2025-11-24T06:55:58.830189615Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.6\"" Nov 24 06:55:59.577124 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1282012669.mount: Deactivated successfully. 
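The jump from 06:54:17 to 06:55:58 above is not a pause in the boot: systemd-timesyncd reached an NTP server (2.flatcar.pool.ntp.org) and stepped the wall clock forward, and systemd-resolved flushed its caches in response. The size of the step can be computed from the two adjacent timestamps, treating both as same-day wall-clock times:

```python
from datetime import datetime

# Last pre-sync timestamp and the "Clock change detected" timestamp
# from the log above; both are same-day wall-clock times.
fmt = "%H:%M:%S.%f"
before = datetime.strptime("06:54:17.252686", fmt)
after = datetime.strptime("06:55:58.195297", fmt)
step = (after - before).total_seconds()
print(round(step, 1))  # 100.9
```

This roughly 101-second forward step also explains why durations that straddle the sync (anything timed across 06:54/06:55) cannot be compared naively against earlier log intervals.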
Nov 24 06:56:00.909203 containerd[1621]: time="2025-11-24T06:56:00.908664867Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:00.909819 containerd[1621]: time="2025-11-24T06:56:00.909789498Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.6: active requests=0, bytes read=30113213" Nov 24 06:56:00.910105 containerd[1621]: time="2025-11-24T06:56:00.910092199Z" level=info msg="ImageCreate event name:\"sha256:74cc54db7bbcced6056c8430786ff02557adfb2ad9e548fa2ae02ff4a3b42c73\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:00.912124 containerd[1621]: time="2025-11-24T06:56:00.912108237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:7c1fe7a61835371b6f42e1acbd87ecc4c456930785ae652e3ce7bcecf8cd4d9c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:00.912516 containerd[1621]: time="2025-11-24T06:56:00.912433928Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.6\" with image id \"sha256:74cc54db7bbcced6056c8430786ff02557adfb2ad9e548fa2ae02ff4a3b42c73\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:7c1fe7a61835371b6f42e1acbd87ecc4c456930785ae652e3ce7bcecf8cd4d9c\", size \"30109812\" in 2.082219815s" Nov 24 06:56:00.912566 containerd[1621]: time="2025-11-24T06:56:00.912557710Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.6\" returns image reference \"sha256:74cc54db7bbcced6056c8430786ff02557adfb2ad9e548fa2ae02ff4a3b42c73\"" Nov 24 06:56:00.913066 containerd[1621]: time="2025-11-24T06:56:00.913056018Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.6\"" Nov 24 06:56:01.969469 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
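The kube-apiserver pull above reports both a compressed size ("30109812" bytes) and a wall time ("in 2.082219815s"), so the effective pull throughput is a one-line division; the figure ignores decompression and unpack overhead included in containerd's timing:

```python
# Pull size and wall time for kube-apiserver as reported in the log
# above (size "30109812" bytes, "in 2.082219815s"); the MB/s figure
# is just that division, ignoring decompression/unpack overhead.
size_bytes = 30109812
seconds = 2.082219815
mb_per_s = size_bytes / seconds / 1e6
print(round(mb_per_s, 2))  # 14.46
```

The same arithmetic applies to the controller-manager and scheduler pulls that follow, whose log lines carry their own size/duration pairs.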
Nov 24 06:56:01.971487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Nov 24 06:56:02.344798 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Nov 24 06:56:02.350856 (kubelet)[2243]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Nov 24 06:56:02.374063 kubelet[2243]: E1124 06:56:02.374029 2243 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Nov 24 06:56:02.375497 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Nov 24 06:56:02.375578 systemd[1]: kubelet.service: Failed with result 'exit-code'. Nov 24 06:56:02.375798 systemd[1]: kubelet.service: Consumed 100ms CPU time, 110M memory peak. 
Nov 24 06:56:02.433181 containerd[1621]: time="2025-11-24T06:56:02.432685359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:02.448424 containerd[1621]: time="2025-11-24T06:56:02.448398026Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.6: active requests=0, bytes read=26018107"
Nov 24 06:56:02.450776 containerd[1621]: time="2025-11-24T06:56:02.450757916Z" level=info msg="ImageCreate event name:\"sha256:9290eb63dc141c2f8d019c41484908f600f19daccfbc45c0a856b067ca47b0af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:02.454567 containerd[1621]: time="2025-11-24T06:56:02.454545520Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:fb1f45370081166f032a2ed3d41deaccc6bb277b4d9841d4aaebad7aada930c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:02.456681 containerd[1621]: time="2025-11-24T06:56:02.456659798Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.6\" with image id \"sha256:9290eb63dc141c2f8d019c41484908f600f19daccfbc45c0a856b067ca47b0af\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:fb1f45370081166f032a2ed3d41deaccc6bb277b4d9841d4aaebad7aada930c5\", size \"27675143\" in 1.543273914s"
Nov 24 06:56:02.456956 containerd[1621]: time="2025-11-24T06:56:02.456942898Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.6\" returns image reference \"sha256:9290eb63dc141c2f8d019c41484908f600f19daccfbc45c0a856b067ca47b0af\""
Nov 24 06:56:02.457594 containerd[1621]: time="2025-11-24T06:56:02.457573977Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.6\""
Nov 24 06:56:04.245943 containerd[1621]: time="2025-11-24T06:56:04.245343689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:04.245943 containerd[1621]: time="2025-11-24T06:56:04.245764248Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.6: active requests=0, bytes read=20156482"
Nov 24 06:56:04.245943 containerd[1621]: time="2025-11-24T06:56:04.245914541Z" level=info msg="ImageCreate event name:\"sha256:6109fc16b0291b0728bc133620fe1906c51d999917dd3add0744a906c0fb7eef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:04.247565 containerd[1621]: time="2025-11-24T06:56:04.247551481Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:02bfac33158a2323cd2d4ba729cb9d7be695b172be21dfd3740e4a608d39a378\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:04.248287 containerd[1621]: time="2025-11-24T06:56:04.248269615Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.6\" with image id \"sha256:6109fc16b0291b0728bc133620fe1906c51d999917dd3add0744a906c0fb7eef\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:02bfac33158a2323cd2d4ba729cb9d7be695b172be21dfd3740e4a608d39a378\", size \"21813536\" in 1.790672961s"
Nov 24 06:56:04.248344 containerd[1621]: time="2025-11-24T06:56:04.248335264Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.6\" returns image reference \"sha256:6109fc16b0291b0728bc133620fe1906c51d999917dd3add0744a906c0fb7eef\""
Nov 24 06:56:04.248738 containerd[1621]: time="2025-11-24T06:56:04.248632961Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.6\""
Nov 24 06:56:05.472756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount477735293.mount: Deactivated successfully.
Nov 24 06:56:06.005647 containerd[1621]: time="2025-11-24T06:56:06.005417606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:06.012824 containerd[1621]: time="2025-11-24T06:56:06.012796746Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.6: active requests=0, bytes read=31929138"
Nov 24 06:56:06.026592 containerd[1621]: time="2025-11-24T06:56:06.026558634Z" level=info msg="ImageCreate event name:\"sha256:87c5a2e6c1d1ea6f96a0b5d43f96c5066e8ff78c9c6adb335631fc9c90cb0a19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:06.039809 containerd[1621]: time="2025-11-24T06:56:06.039776733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9119bd7ae5249b9d8bdd14a7719a0ebf744de112fe618008adca3094a12b67fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:06.040304 containerd[1621]: time="2025-11-24T06:56:06.040150729Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.6\" with image id \"sha256:87c5a2e6c1d1ea6f96a0b5d43f96c5066e8ff78c9c6adb335631fc9c90cb0a19\", repo tag \"registry.k8s.io/kube-proxy:v1.33.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:9119bd7ae5249b9d8bdd14a7719a0ebf744de112fe618008adca3094a12b67fc\", size \"31928157\" in 1.791342913s"
Nov 24 06:56:06.040304 containerd[1621]: time="2025-11-24T06:56:06.040174510Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.6\" returns image reference \"sha256:87c5a2e6c1d1ea6f96a0b5d43f96c5066e8ff78c9c6adb335631fc9c90cb0a19\""
Nov 24 06:56:06.040560 containerd[1621]: time="2025-11-24T06:56:06.040443794Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Nov 24 06:56:06.606892 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount853011441.mount: Deactivated successfully.
Nov 24 06:56:07.537105 containerd[1621]: time="2025-11-24T06:56:07.537062314Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:07.542800 containerd[1621]: time="2025-11-24T06:56:07.542767918Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238"
Nov 24 06:56:07.549950 containerd[1621]: time="2025-11-24T06:56:07.549914537Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:07.555729 containerd[1621]: time="2025-11-24T06:56:07.555686030Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:07.556721 containerd[1621]: time="2025-11-24T06:56:07.556401457Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.515939592s"
Nov 24 06:56:07.556721 containerd[1621]: time="2025-11-24T06:56:07.556425278Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Nov 24 06:56:07.557120 containerd[1621]: time="2025-11-24T06:56:07.557096801Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Nov 24 06:56:08.157648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount726730206.mount: Deactivated successfully.
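Each pull in this log pairs a byte count ("bytes read=…") with a wall-clock duration ("in …s"), so effective registry throughput can be read straight off the entries. A small worked sketch using the coredns numbers from the entries above (the figures are copied from the log; the MiB conversion is mine):

```python
# Figures taken from the coredns pull entries in the journal above.
bytes_read = 20942238   # "bytes read=20942238"
seconds = 1.515939592   # "... in 1.515939592s"

# Effective pull throughput in MiB/s (1 MiB = 1024 * 1024 bytes).
throughput_mib_s = bytes_read / seconds / (1024 * 1024)
print(f"coredns pull: {throughput_mib_s:.1f} MiB/s")
```

The same arithmetic applied to the other pulls gives a rough sense of whether a slow pull (e.g. the ~5 s etcd pull later in the log) is bandwidth or registry latency.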
Nov 24 06:56:08.195644 containerd[1621]: time="2025-11-24T06:56:08.195589165Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Nov 24 06:56:08.200254 containerd[1621]: time="2025-11-24T06:56:08.200230766Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Nov 24 06:56:08.207426 containerd[1621]: time="2025-11-24T06:56:08.207402001Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Nov 24 06:56:08.210065 containerd[1621]: time="2025-11-24T06:56:08.210042091Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Nov 24 06:56:08.210671 containerd[1621]: time="2025-11-24T06:56:08.210564439Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 653.445922ms"
Nov 24 06:56:08.210671 containerd[1621]: time="2025-11-24T06:56:08.210586363Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Nov 24 06:56:08.211061 containerd[1621]: time="2025-11-24T06:56:08.210919213Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
Nov 24 06:56:08.924785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1164230867.mount: Deactivated successfully.
Nov 24 06:56:10.780706 update_engine[1595]: I20251124 06:56:10.780655 1595 update_attempter.cc:509] Updating boot flags...
Nov 24 06:56:12.625887 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Nov 24 06:56:12.627004 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Nov 24 06:56:13.048081 containerd[1621]: time="2025-11-24T06:56:13.048002403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:13.172536 containerd[1621]: time="2025-11-24T06:56:13.172500002Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58926227"
Nov 24 06:56:13.172770 containerd[1621]: time="2025-11-24T06:56:13.172754876Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:13.175152 containerd[1621]: time="2025-11-24T06:56:13.175127783Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:13.175685 containerd[1621]: time="2025-11-24T06:56:13.175577414Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 4.964642067s"
Nov 24 06:56:13.175685 containerd[1621]: time="2025-11-24T06:56:13.175594881Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
Nov 24 06:56:13.185818 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:13.197927 (kubelet)[2406]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Nov 24 06:56:13.242193 kubelet[2406]: E1124 06:56:13.242162 2406 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Nov 24 06:56:13.243699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 06:56:13.243843 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Nov 24 06:56:13.244218 systemd[1]: kubelet.service: Consumed 94ms CPU time, 109M memory peak.
Nov 24 06:56:15.111116 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:15.111452 systemd[1]: kubelet.service: Consumed 94ms CPU time, 109M memory peak.
Nov 24 06:56:15.113523 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Nov 24 06:56:15.135089 systemd[1]: Reload requested from client PID 2435 ('systemctl') (unit session-9.scope)...
Nov 24 06:56:15.135182 systemd[1]: Reloading...
Nov 24 06:56:15.210640 zram_generator::config[2482]: No configuration found.
Nov 24 06:56:15.279499 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Nov 24 06:56:15.347280 systemd[1]: Reloading finished in 211 ms.
Nov 24 06:56:15.367701 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Nov 24 06:56:15.367756 systemd[1]: kubelet.service: Failed with result 'signal'.
Nov 24 06:56:15.367938 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:15.368002 systemd[1]: kubelet.service: Consumed 53ms CPU time, 87.6M memory peak.
Nov 24 06:56:15.369524 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Nov 24 06:56:15.648023 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:15.655867 (kubelet)[2546]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Nov 24 06:56:15.696413 kubelet[2546]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 24 06:56:15.696413 kubelet[2546]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Nov 24 06:56:15.696413 kubelet[2546]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 24 06:56:15.710883 kubelet[2546]: I1124 06:56:15.710839 2546 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 24 06:56:16.215636 kubelet[2546]: I1124 06:56:16.215520 2546 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Nov 24 06:56:16.215636 kubelet[2546]: I1124 06:56:16.215539 2546 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 24 06:56:16.215956 kubelet[2546]: I1124 06:56:16.215946 2546 server.go:956] "Client rotation is on, will bootstrap in background"
Nov 24 06:56:16.246459 kubelet[2546]: I1124 06:56:16.246255 2546 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Nov 24 06:56:16.247289 kubelet[2546]: E1124 06:56:16.247239 2546 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Nov 24 06:56:16.263229 kubelet[2546]: I1124 06:56:16.263214 2546 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 24 06:56:16.271859 kubelet[2546]: I1124 06:56:16.271843 2546 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Nov 24 06:56:16.274098 kubelet[2546]: I1124 06:56:16.274072 2546 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 24 06:56:16.276406 kubelet[2546]: I1124 06:56:16.274096 2546 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 24 06:56:16.277117 kubelet[2546]: I1124 06:56:16.277103 2546 topology_manager.go:138] "Creating topology manager with none policy"
Nov 24 06:56:16.277117 kubelet[2546]: I1124 06:56:16.277116 2546 container_manager_linux.go:303] "Creating device plugin manager"
Nov 24 06:56:16.277204 kubelet[2546]: I1124 06:56:16.277193 2546 state_mem.go:36] "Initialized new in-memory state store"
Nov 24 06:56:16.279288 kubelet[2546]: I1124 06:56:16.279275 2546 kubelet.go:480] "Attempting to sync node with API server"
Nov 24 06:56:16.279288 kubelet[2546]: I1124 06:56:16.279289 2546 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 24 06:56:16.280354 kubelet[2546]: I1124 06:56:16.280340 2546 kubelet.go:386] "Adding apiserver pod source"
Nov 24 06:56:16.281896 kubelet[2546]: I1124 06:56:16.281613 2546 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 24 06:56:16.286087 kubelet[2546]: I1124 06:56:16.286074 2546 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Nov 24 06:56:16.287479 kubelet[2546]: I1124 06:56:16.287470 2546 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Nov 24 06:56:16.288041 kubelet[2546]: W1124 06:56:16.288033 2546 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
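The container-manager entry above dumps the kubelet's node config as one JSON object, including the default hard-eviction thresholds. A sketch that pulls those thresholds back out of a fragment of that JSON (the fragment is re-typed from the log, keeping only the HardEvictionThresholds key; grace-period and reclaim fields are dropped for brevity):

```python
import json

# Re-typed fragment of the nodeConfig JSON logged by container_manager_linux.go above.
node_config = json.loads("""
{"HardEvictionThresholds":[
  {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15}},
  {"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}},
  {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
  {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}},
  {"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05}}
]}
""")

# A threshold carries either an absolute Quantity ("100Mi") or a Percentage.
for t in node_config["HardEvictionThresholds"]:
    v = t["Value"]
    limit = v["Quantity"] if v["Quantity"] is not None else f"{v['Percentage']:.0%}"
    print(f"{t['Signal']} {t['Operator']} {limit}")
```

These are the stock kubelet defaults; on this node they are in effect because no eviction settings were overridden in a config file.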
Nov 24 06:56:16.291605 kubelet[2546]: E1124 06:56:16.291333 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Nov 24 06:56:16.294790 kubelet[2546]: I1124 06:56:16.294781 2546 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Nov 24 06:56:16.294857 kubelet[2546]: I1124 06:56:16.294852 2546 server.go:1289] "Started kubelet"
Nov 24 06:56:16.295819 kubelet[2546]: E1124 06:56:16.295795 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Nov 24 06:56:16.297228 kubelet[2546]: I1124 06:56:16.297209 2546 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Nov 24 06:56:16.300571 kubelet[2546]: I1124 06:56:16.300563 2546 server.go:317] "Adding debug handlers to kubelet server"
Nov 24 06:56:16.302809 kubelet[2546]: I1124 06:56:16.302796 2546 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 24 06:56:16.308951 kubelet[2546]: E1124 06:56:16.305516 2546 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.187adefb9b6e6bd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-11-24 06:56:16.294833113 +0000 UTC m=+0.636193766,LastTimestamp:2025-11-24 06:56:16.294833113 +0000 UTC m=+0.636193766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Nov 24 06:56:16.310091 kubelet[2546]: I1124 06:56:16.309047 2546 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 24 06:56:16.310091 kubelet[2546]: I1124 06:56:16.309157 2546 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 24 06:56:16.310091 kubelet[2546]: I1124 06:56:16.309261 2546 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Nov 24 06:56:16.310763 kubelet[2546]: I1124 06:56:16.310739 2546 volume_manager.go:297] "Starting Kubelet Volume Manager"
Nov 24 06:56:16.311396 kubelet[2546]: E1124 06:56:16.311016 2546 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
Nov 24 06:56:16.313215 kubelet[2546]: I1124 06:56:16.312888 2546 factory.go:223] Registration of the systemd container factory successfully
Nov 24 06:56:16.313215 kubelet[2546]: I1124 06:56:16.312945 2546 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Nov 24 06:56:16.313870 kubelet[2546]: E1124 06:56:16.313846 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="200ms"
Nov 24 06:56:16.314052 kubelet[2546]: I1124 06:56:16.314046 2546 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Nov 24 06:56:16.315544 kubelet[2546]: I1124 06:56:16.315531 2546 reconciler.go:26] "Reconciler: start to sync state"
Nov 24 06:56:16.318043 kubelet[2546]: E1124 06:56:16.317168 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Nov 24 06:56:16.318043 kubelet[2546]: I1124 06:56:16.317278 2546 factory.go:223] Registration of the containerd container factory successfully
Nov 24 06:56:16.321820 kubelet[2546]: I1124 06:56:16.321796 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Nov 24 06:56:16.322579 kubelet[2546]: I1124 06:56:16.322565 2546 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Nov 24 06:56:16.322579 kubelet[2546]: I1124 06:56:16.322580 2546 status_manager.go:230] "Starting to sync pod status with apiserver"
Nov 24 06:56:16.322640 kubelet[2546]: I1124 06:56:16.322591 2546 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Nov 24 06:56:16.322640 kubelet[2546]: I1124 06:56:16.322595 2546 kubelet.go:2436] "Starting kubelet main sync loop"
Nov 24 06:56:16.322640 kubelet[2546]: E1124 06:56:16.322615 2546 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Nov 24 06:56:16.327110 kubelet[2546]: E1124 06:56:16.327096 2546 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Nov 24 06:56:16.328746 kubelet[2546]: E1124 06:56:16.328731 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Nov 24 06:56:16.342270 kubelet[2546]: I1124 06:56:16.342237 2546 cpu_manager.go:221] "Starting CPU manager" policy="none"
Nov 24 06:56:16.342270 kubelet[2546]: I1124 06:56:16.342265 2546 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Nov 24 06:56:16.342347 kubelet[2546]: I1124 06:56:16.342278 2546 state_mem.go:36] "Initialized new in-memory state store"
Nov 24 06:56:16.345366 kubelet[2546]: I1124 06:56:16.345355 2546 policy_none.go:49] "None policy: Start"
Nov 24 06:56:16.345366 kubelet[2546]: I1124 06:56:16.345366 2546 memory_manager.go:186] "Starting memorymanager" policy="None"
Nov 24 06:56:16.345426 kubelet[2546]: I1124 06:56:16.345373 2546 state_mem.go:35] "Initializing new in-memory state store"
Nov 24 06:56:16.350584 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Nov 24 06:56:16.361150 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Nov 24 06:56:16.363743 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Nov 24 06:56:16.370153 kubelet[2546]: E1124 06:56:16.370141 2546 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Nov 24 06:56:16.370322 kubelet[2546]: I1124 06:56:16.370315 2546 eviction_manager.go:189] "Eviction manager: starting control loop"
Nov 24 06:56:16.370375 kubelet[2546]: I1124 06:56:16.370358 2546 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Nov 24 06:56:16.370572 kubelet[2546]: I1124 06:56:16.370565 2546 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Nov 24 06:56:16.371416 kubelet[2546]: E1124 06:56:16.371407 2546 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Nov 24 06:56:16.371482 kubelet[2546]: E1124 06:56:16.371476 2546 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Nov 24 06:56:16.432205 systemd[1]: Created slice kubepods-burstable-pod1d5832191310254249cf17c2353d71ec.slice - libcontainer container kubepods-burstable-pod1d5832191310254249cf17c2353d71ec.slice.
Nov 24 06:56:16.449202 kubelet[2546]: E1124 06:56:16.449093 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:16.451090 systemd[1]: Created slice kubepods-burstable-pode51b49401d7e125d16957469facd7352.slice - libcontainer container kubepods-burstable-pode51b49401d7e125d16957469facd7352.slice.
Nov 24 06:56:16.460612 kubelet[2546]: E1124 06:56:16.460567 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:16.462993 systemd[1]: Created slice kubepods-burstable-pod19161de957280c578b5c0a74a63c4967.slice - libcontainer container kubepods-burstable-pod19161de957280c578b5c0a74a63c4967.slice.
Nov 24 06:56:16.464363 kubelet[2546]: E1124 06:56:16.464352 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:16.472221 kubelet[2546]: I1124 06:56:16.472172 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Nov 24 06:56:16.472616 kubelet[2546]: E1124 06:56:16.472593 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost"
Nov 24 06:56:16.514579 kubelet[2546]: E1124 06:56:16.514549 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="400ms"
Nov 24 06:56:16.516911 kubelet[2546]: I1124 06:56:16.516761 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:16.516911 kubelet[2546]: I1124 06:56:16.516784 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:16.516911 kubelet[2546]: I1124 06:56:16.516799 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:16.516911 kubelet[2546]: I1124 06:56:16.516810 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:16.516911 kubelet[2546]: I1124 06:56:16.516823 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:16.517060 kubelet[2546]: I1124 06:56:16.516834 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:16.517060 kubelet[2546]: I1124 06:56:16.516845 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:16.517060 kubelet[2546]: I1124 06:56:16.516855 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e51b49401d7e125d16957469facd7352-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"e51b49401d7e125d16957469facd7352\") " pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:16.517060 kubelet[2546]: I1124 06:56:16.516868 2546 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:16.673873 kubelet[2546]: I1124 06:56:16.673845 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Nov 24 06:56:16.674108 kubelet[2546]: E1124 06:56:16.674088 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost"
Nov 24 06:56:16.750816 containerd[1621]: time="2025-11-24T06:56:16.750492543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1d5832191310254249cf17c2353d71ec,Namespace:kube-system,Attempt:0,}"
Nov 24 06:56:16.764651 containerd[1621]: time="2025-11-24T06:56:16.764594603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:e51b49401d7e125d16957469facd7352,Namespace:kube-system,Attempt:0,}"
Nov 24 06:56:16.766391 containerd[1621]: time="2025-11-24T06:56:16.766303992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:19161de957280c578b5c0a74a63c4967,Namespace:kube-system,Attempt:0,}"
Nov 24 06:56:16.864308 containerd[1621]: time="2025-11-24T06:56:16.864282879Z" level=info msg="connecting to shim 5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a" address="unix:///run/containerd/s/821625c398d9f6da85d7ccb5de2b4e3dc27ae852f908e8bcb100fbad0c7208e7" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:16.865136 containerd[1621]: time="2025-11-24T06:56:16.865124012Z" level=info msg="connecting to shim a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293" address="unix:///run/containerd/s/d69c9c9086d909080690fb6cd9bddc7ab85234d9a9c918168fee2a2fc8608f41" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:16.879026 containerd[1621]: time="2025-11-24T06:56:16.879000106Z" level=info msg="connecting to shim b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8" address="unix:///run/containerd/s/2da506c6de2547b5a026d8343b7c1ccb890169e1983fb803de08afbb5e16d544" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:16.914068 kubelet[2546]: E1124 06:56:16.913996 2546 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.102:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.102:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.187adefb9b6e6bd9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-11-24 06:56:16.294833113 +0000 UTC m=+0.636193766,LastTimestamp:2025-11-24 06:56:16.294833113 +0000 UTC m=+0.636193766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Nov 24 06:56:16.915341
kubelet[2546]: E1124 06:56:16.915319 2546 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.102:6443: connect: connection refused" interval="800ms" Nov 24 06:56:16.970782 systemd[1]: Started cri-containerd-a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293.scope - libcontainer container a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293. Nov 24 06:56:16.975065 systemd[1]: Started cri-containerd-5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a.scope - libcontainer container 5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a. Nov 24 06:56:16.980205 systemd[1]: Started cri-containerd-b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8.scope - libcontainer container b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8. Nov 24 06:56:17.021781 containerd[1621]: time="2025-11-24T06:56:17.021333334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:1d5832191310254249cf17c2353d71ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8\"" Nov 24 06:56:17.025981 containerd[1621]: time="2025-11-24T06:56:17.025963660Z" level=info msg="CreateContainer within sandbox \"b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Nov 24 06:56:17.028926 containerd[1621]: time="2025-11-24T06:56:17.028903901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:19161de957280c578b5c0a74a63c4967,Namespace:kube-system,Attempt:0,} returns sandbox id \"a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293\"" Nov 24 06:56:17.030732 containerd[1621]: time="2025-11-24T06:56:17.030707218Z" level=info 
msg="CreateContainer within sandbox \"a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Nov 24 06:56:17.033270 containerd[1621]: time="2025-11-24T06:56:17.033255602Z" level=info msg="Container b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:56:17.036265 containerd[1621]: time="2025-11-24T06:56:17.035912398Z" level=info msg="Container 5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:56:17.038861 containerd[1621]: time="2025-11-24T06:56:17.038843230Z" level=info msg="CreateContainer within sandbox \"b930a9d9ecdd385d8d6fc1ab3dfc9f30362904ec21a92017e095f4470b2babf8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5\"" Nov 24 06:56:17.040271 containerd[1621]: time="2025-11-24T06:56:17.040256543Z" level=info msg="StartContainer for \"b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5\"" Nov 24 06:56:17.041438 containerd[1621]: time="2025-11-24T06:56:17.041425749Z" level=info msg="CreateContainer within sandbox \"a3d0bfd9bcc0c176b4b7ddc0c5d87241cb5c8ef92d76c313448a85f80cbfb293\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4\"" Nov 24 06:56:17.042268 containerd[1621]: time="2025-11-24T06:56:17.041754887Z" level=info msg="connecting to shim b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5" address="unix:///run/containerd/s/2da506c6de2547b5a026d8343b7c1ccb890169e1983fb803de08afbb5e16d544" protocol=ttrpc version=3 Nov 24 06:56:17.043695 containerd[1621]: time="2025-11-24T06:56:17.043684589Z" level=info msg="StartContainer for \"5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4\"" Nov 24 06:56:17.044213 
containerd[1621]: time="2025-11-24T06:56:17.044200823Z" level=info msg="connecting to shim 5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4" address="unix:///run/containerd/s/d69c9c9086d909080690fb6cd9bddc7ab85234d9a9c918168fee2a2fc8608f41" protocol=ttrpc version=3 Nov 24 06:56:17.055076 containerd[1621]: time="2025-11-24T06:56:17.055056427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:e51b49401d7e125d16957469facd7352,Namespace:kube-system,Attempt:0,} returns sandbox id \"5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a\"" Nov 24 06:56:17.058308 containerd[1621]: time="2025-11-24T06:56:17.058295303Z" level=info msg="CreateContainer within sandbox \"5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Nov 24 06:56:17.064702 containerd[1621]: time="2025-11-24T06:56:17.064687257Z" level=info msg="Container 28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:56:17.066848 containerd[1621]: time="2025-11-24T06:56:17.066829498Z" level=info msg="CreateContainer within sandbox \"5c77ad222d292a0514664de039ad2e1c62976b011a13547d4831293acc14083a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af\"" Nov 24 06:56:17.067609 containerd[1621]: time="2025-11-24T06:56:17.067068927Z" level=info msg="StartContainer for \"28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af\"" Nov 24 06:56:17.067609 containerd[1621]: time="2025-11-24T06:56:17.067576749Z" level=info msg="connecting to shim 28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af" address="unix:///run/containerd/s/821625c398d9f6da85d7ccb5de2b4e3dc27ae852f908e8bcb100fbad0c7208e7" protocol=ttrpc version=3 Nov 24 06:56:17.070859 systemd[1]: Started 
cri-containerd-b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5.scope - libcontainer container b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5. Nov 24 06:56:17.073809 systemd[1]: Started cri-containerd-5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4.scope - libcontainer container 5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4. Nov 24 06:56:17.076984 kubelet[2546]: I1124 06:56:17.076969 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Nov 24 06:56:17.077333 kubelet[2546]: E1124 06:56:17.077309 2546 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.102:6443/api/v1/nodes\": dial tcp 139.178.70.102:6443: connect: connection refused" node="localhost" Nov 24 06:56:17.087313 systemd[1]: Started cri-containerd-28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af.scope - libcontainer container 28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af. 
Nov 24 06:56:17.126131 containerd[1621]: time="2025-11-24T06:56:17.126111634Z" level=info msg="StartContainer for \"28fa1d97571667f21a83d2d94b8d62468e2301cccfd0c0ac1f55a8c332f9a4af\" returns successfully"
Nov 24 06:56:17.137537 containerd[1621]: time="2025-11-24T06:56:17.137501329Z" level=info msg="StartContainer for \"5d3fd599c029d0272b272812670718ec12d4bbd9f9784ffc13f0140e6565cec4\" returns successfully"
Nov 24 06:56:17.147758 containerd[1621]: time="2025-11-24T06:56:17.147724931Z" level=info msg="StartContainer for \"b3090aa400b9170ff95755440465d7cac63c6b97d3c0cdca0c7ee5d288ee75e5\" returns successfully"
Nov 24 06:56:17.237049 kubelet[2546]: E1124 06:56:17.237026 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Nov 24 06:56:17.346195 kubelet[2546]: E1124 06:56:17.346176 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:17.346351 kubelet[2546]: E1124 06:56:17.346342 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:17.349378 kubelet[2546]: E1124 06:56:17.349365 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:17.560100 kubelet[2546]: E1124 06:56:17.560072 2546 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.102:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Nov 24 06:56:17.879870 kubelet[2546]: I1124 06:56:17.879778 2546 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Nov 24 06:56:18.350921 kubelet[2546]: E1124 06:56:18.350827 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:18.351302 kubelet[2546]: E1124 06:56:18.351236 2546 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Nov 24 06:56:18.589909 kubelet[2546]: E1124 06:56:18.589880 2546 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Nov 24 06:56:18.658283 kubelet[2546]: I1124 06:56:18.658097 2546 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Nov 24 06:56:18.658283 kubelet[2546]: E1124 06:56:18.658124 2546 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Nov 24 06:56:18.713319 kubelet[2546]: I1124 06:56:18.713253 2546 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:18.717327 kubelet[2546]: E1124 06:56:18.717300 2546 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:18.717327 kubelet[2546]: I1124 06:56:18.717317 2546 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:18.718121 kubelet[2546]: E1124 06:56:18.718104 2546 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:18.718121 kubelet[2546]: I1124 06:56:18.718117 2546 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:18.718977 kubelet[2546]: E1124 06:56:18.718964 2546 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:19.295807 kubelet[2546]: I1124 06:56:19.295784 2546 apiserver.go:52] "Watching apiserver"
Nov 24 06:56:19.314646 kubelet[2546]: I1124 06:56:19.314599 2546 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Nov 24 06:56:19.623492 kubelet[2546]: I1124 06:56:19.623413 2546 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:20.030756 kubelet[2546]: I1124 06:56:20.030604 2546 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:20.314127 systemd[1]: Reload requested from client PID 2821 ('systemctl') (unit session-9.scope)...
Nov 24 06:56:20.314140 systemd[1]: Reloading...
Nov 24 06:56:20.375687 zram_generator::config[2866]: No configuration found.
Nov 24 06:56:20.463126 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Nov 24 06:56:20.544020 systemd[1]: Reloading finished in 229 ms.
Nov 24 06:56:20.571753 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Nov 24 06:56:20.587497 systemd[1]: kubelet.service: Deactivated successfully.
Nov 24 06:56:20.587678 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:20.587711 systemd[1]: kubelet.service: Consumed 784ms CPU time, 131.1M memory peak.
Nov 24 06:56:20.588982 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Nov 24 06:56:20.956545 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Nov 24 06:56:20.966337 (kubelet)[2932]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Nov 24 06:56:21.032460 kubelet[2932]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 24 06:56:21.032695 kubelet[2932]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Nov 24 06:56:21.032728 kubelet[2932]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 24 06:56:21.032822 kubelet[2932]: I1124 06:56:21.032802 2932 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 24 06:56:21.036841 kubelet[2932]: I1124 06:56:21.036819 2932 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Nov 24 06:56:21.036841 kubelet[2932]: I1124 06:56:21.036836 2932 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 24 06:56:21.036993 kubelet[2932]: I1124 06:56:21.036980 2932 server.go:956] "Client rotation is on, will bootstrap in background"
Nov 24 06:56:21.037756 kubelet[2932]: I1124 06:56:21.037743 2932 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Nov 24 06:56:21.053171 kubelet[2932]: I1124 06:56:21.053072 2932 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Nov 24 06:56:21.064045 kubelet[2932]: I1124 06:56:21.064031 2932 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 24 06:56:21.067301 kubelet[2932]: I1124 06:56:21.066555 2932 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Nov 24 06:56:21.067301 kubelet[2932]: I1124 06:56:21.066689 2932 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 24 06:56:21.067301 kubelet[2932]: I1124 06:56:21.066705 2932 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 24 06:56:21.067301 kubelet[2932]: I1124 06:56:21.066847 2932 topology_manager.go:138] "Creating topology manager with none policy"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.066854 2932 container_manager_linux.go:303] "Creating device plugin manager"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.066884 2932 state_mem.go:36] "Initialized new in-memory state store"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.067018 2932 kubelet.go:480] "Attempting to sync node with API server"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.067027 2932 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.067044 2932 kubelet.go:386] "Adding apiserver pod source"
Nov 24 06:56:21.067458 kubelet[2932]: I1124 06:56:21.067053 2932 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 24 06:56:21.080875 kubelet[2932]: I1124 06:56:21.080855 2932 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Nov 24 06:56:21.081188 kubelet[2932]: I1124 06:56:21.081176 2932 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Nov 24 06:56:21.082770 kubelet[2932]: I1124 06:56:21.082760 2932 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Nov 24 06:56:21.082807 kubelet[2932]: I1124 06:56:21.082784 2932 server.go:1289] "Started kubelet"
Nov 24 06:56:21.083704 kubelet[2932]: I1124 06:56:21.083689 2932 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 24 06:56:21.095941 kubelet[2932]: I1124 06:56:21.095897 2932 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Nov 24 06:56:21.097014 kubelet[2932]: I1124 06:56:21.096971 2932 server.go:317] "Adding debug handlers to kubelet server"
Nov 24 06:56:21.099019 kubelet[2932]: I1124 06:56:21.099001 2932 volume_manager.go:297] "Starting Kubelet Volume Manager"
Nov 24 06:56:21.099097 kubelet[2932]: I1124 06:56:21.098996 2932 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 24 06:56:21.099237 kubelet[2932]: I1124 06:56:21.099229 2932 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 24 06:56:21.099549 kubelet[2932]: I1124 06:56:21.099540 2932 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Nov 24 06:56:21.101160 kubelet[2932]: I1124 06:56:21.101136 2932 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Nov 24 06:56:21.101304 kubelet[2932]: I1124 06:56:21.101224 2932 reconciler.go:26] "Reconciler: start to sync state"
Nov 24 06:56:21.103082 kubelet[2932]: I1124 06:56:21.103063 2932 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Nov 24 06:56:21.103782 kubelet[2932]: I1124 06:56:21.103764 2932 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Nov 24 06:56:21.103815 kubelet[2932]: I1124 06:56:21.103783 2932 status_manager.go:230] "Starting to sync pod status with apiserver"
Nov 24 06:56:21.103815 kubelet[2932]: I1124 06:56:21.103795 2932 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Nov 24 06:56:21.103815 kubelet[2932]: I1124 06:56:21.103799 2932 kubelet.go:2436] "Starting kubelet main sync loop"
Nov 24 06:56:21.103871 kubelet[2932]: E1124 06:56:21.103821 2932 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Nov 24 06:56:21.108824 kubelet[2932]: I1124 06:56:21.108696 2932 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Nov 24 06:56:21.112002 kubelet[2932]: E1124 06:56:21.111979 2932 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Nov 24 06:56:21.120639 kubelet[2932]: I1124 06:56:21.119321 2932 factory.go:223] Registration of the containerd container factory successfully
Nov 24 06:56:21.120639 kubelet[2932]: I1124 06:56:21.119336 2932 factory.go:223] Registration of the systemd container factory successfully
Nov 24 06:56:21.151492 kubelet[2932]: I1124 06:56:21.151470 2932 cpu_manager.go:221] "Starting CPU manager" policy="none"
Nov 24 06:56:21.151492 kubelet[2932]: I1124 06:56:21.151484 2932 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Nov 24 06:56:21.151492 kubelet[2932]: I1124 06:56:21.151496 2932 state_mem.go:36] "Initialized new in-memory state store"
Nov 24 06:56:21.151641 kubelet[2932]: I1124 06:56:21.151623 2932 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Nov 24 06:56:21.151676 kubelet[2932]: I1124 06:56:21.151634 2932 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Nov 24 06:56:21.151676 kubelet[2932]: I1124 06:56:21.151648 2932 policy_none.go:49] "None policy: Start"
Nov 24 06:56:21.151676 kubelet[2932]: I1124 06:56:21.151655 2932 memory_manager.go:186] "Starting memorymanager" policy="None"
Nov 24 06:56:21.151676 kubelet[2932]: I1124 06:56:21.151662 2932 state_mem.go:35] "Initializing new in-memory state store"
Nov 24 06:56:21.151749 kubelet[2932]: I1124 06:56:21.151738 2932 state_mem.go:75] "Updated machine memory state"
Nov 24 06:56:21.154285 kubelet[2932]: E1124 06:56:21.154273 2932 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Nov 24 06:56:21.154729 kubelet[2932]: I1124 06:56:21.154598 2932 eviction_manager.go:189] "Eviction manager: starting control loop"
Nov 24 06:56:21.154729 kubelet[2932]: I1124 06:56:21.154606 2932 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Nov 24 06:56:21.154784 kubelet[2932]: I1124 06:56:21.154750 2932 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Nov 24 06:56:21.155873 kubelet[2932]: E1124 06:56:21.155862 2932 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Nov 24 06:56:21.205175 kubelet[2932]: I1124 06:56:21.205147 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:21.205586 kubelet[2932]: I1124 06:56:21.205433 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:21.205762 kubelet[2932]: I1124 06:56:21.205480 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:21.210087 kubelet[2932]: E1124 06:56:21.209359 2932 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:21.210199 kubelet[2932]: E1124 06:56:21.209480 2932 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:21.258433 kubelet[2932]: I1124 06:56:21.258400 2932 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Nov 24 06:56:21.263186 kubelet[2932]: I1124 06:56:21.263123 2932 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Nov 24 06:56:21.263354 kubelet[2932]: I1124 06:56:21.263207 2932 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Nov 24 06:56:21.402807 kubelet[2932]: I1124 06:56:21.402615 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:21.402807 kubelet[2932]: I1124 06:56:21.402675 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:21.402807 kubelet[2932]: I1124 06:56:21.402691 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e51b49401d7e125d16957469facd7352-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"e51b49401d7e125d16957469facd7352\") " pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:21.402807 kubelet[2932]: I1124 06:56:21.402703 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:21.402807 kubelet[2932]: I1124 06:56:21.402715 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:21.402969 kubelet[2932]: I1124 06:56:21.402728 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:21.402969 kubelet[2932]: I1124 06:56:21.402739 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19161de957280c578b5c0a74a63c4967-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"19161de957280c578b5c0a74a63c4967\") " pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:21.402969 kubelet[2932]: I1124 06:56:21.402749 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:21.402969 kubelet[2932]: I1124 06:56:21.402762 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1d5832191310254249cf17c2353d71ec-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"1d5832191310254249cf17c2353d71ec\") " pod="kube-system/kube-controller-manager-localhost"
Nov 24 06:56:22.087138 kubelet[2932]: I1124 06:56:22.086959 2932 apiserver.go:52] "Watching apiserver"
Nov 24 06:56:22.101847 kubelet[2932]: I1124 06:56:22.101803 2932 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Nov 24 06:56:22.120813 kubelet[2932]: I1124 06:56:22.120787 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:22.121036 kubelet[2932]: I1124 06:56:22.121028 2932 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:22.127118 kubelet[2932]: E1124 06:56:22.127093 2932 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Nov 24 06:56:22.127640 kubelet[2932]: E1124 06:56:22.127542 2932 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Nov 24 06:56:22.144011 kubelet[2932]: I1124 06:56:22.143935 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.143912778 podStartE2EDuration="1.143912778s" podCreationTimestamp="2025-11-24 06:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:56:22.133885611 +0000 UTC m=+1.148957422" watchObservedRunningTime="2025-11-24 06:56:22.143912778 +0000 UTC m=+1.158984580"
Nov 24 06:56:22.144638 kubelet[2932]: I1124 06:56:22.144612 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.144606059 podStartE2EDuration="3.144606059s" podCreationTimestamp="2025-11-24 06:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:56:22.144347217 +0000 UTC m=+1.159419019" watchObservedRunningTime="2025-11-24 06:56:22.144606059 +0000 UTC m=+1.159677870"
Nov 24 06:56:22.162094 kubelet[2932]: I1124 06:56:22.162005 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.161989826 podStartE2EDuration="2.161989826s" podCreationTimestamp="2025-11-24 06:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:56:22.150977033 +0000 UTC m=+1.166048839" watchObservedRunningTime="2025-11-24 06:56:22.161989826 +0000 UTC m=+1.177061628"
Nov 24 06:56:27.605814 kubelet[2932]: I1124 06:56:27.605787 2932 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Nov 24 06:56:27.606902 containerd[1621]: time="2025-11-24T06:56:27.606304896Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Nov 24 06:56:27.607082 kubelet[2932]: I1124 06:56:27.606436 2932 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Nov 24 06:56:28.346646 systemd[1]: Created slice kubepods-besteffort-podafc8df1e_7c63_4c7f_ac6c_fb52af81f4c6.slice - libcontainer container kubepods-besteffort-podafc8df1e_7c63_4c7f_ac6c_fb52af81f4c6.slice.
Nov 24 06:56:28.449334 kubelet[2932]: I1124 06:56:28.449307 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5c49\" (UniqueName: \"kubernetes.io/projected/afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6-kube-api-access-n5c49\") pod \"tigera-operator-7dcd859c48-mj5qk\" (UID: \"afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6\") " pod="tigera-operator/tigera-operator-7dcd859c48-mj5qk"
Nov 24 06:56:28.449334 kubelet[2932]: I1124 06:56:28.449335 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6-var-lib-calico\") pod \"tigera-operator-7dcd859c48-mj5qk\" (UID: \"afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6\") " pod="tigera-operator/tigera-operator-7dcd859c48-mj5qk"
Nov 24 06:56:28.506047 kubelet[2932]: I1124 06:56:28.505933 2932 status_manager.go:895] "Failed to get status for pod" podUID="6d961835-54f7-4f29-a188-dc2492f544ac" pod="kube-system/kube-proxy-cnbh6" err="pods \"kube-proxy-cnbh6\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'localhost' and this object"
Nov 24 06:56:28.506047 kubelet[2932]: E1124 06:56:28.505940 2932 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-proxy\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"kube-proxy\"" type="*v1.ConfigMap"
Nov 24 06:56:28.510773 systemd[1]: Created slice kubepods-besteffort-pod6d961835_54f7_4f29_a188_dc2492f544ac.slice - libcontainer container kubepods-besteffort-pod6d961835_54f7_4f29_a188_dc2492f544ac.slice.
Nov 24 06:56:28.549744 kubelet[2932]: I1124 06:56:28.549711 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6d961835-54f7-4f29-a188-dc2492f544ac-xtables-lock\") pod \"kube-proxy-cnbh6\" (UID: \"6d961835-54f7-4f29-a188-dc2492f544ac\") " pod="kube-system/kube-proxy-cnbh6"
Nov 24 06:56:28.549744 kubelet[2932]: I1124 06:56:28.549745 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d961835-54f7-4f29-a188-dc2492f544ac-lib-modules\") pod \"kube-proxy-cnbh6\" (UID: \"6d961835-54f7-4f29-a188-dc2492f544ac\") " pod="kube-system/kube-proxy-cnbh6"
Nov 24 06:56:28.549877 kubelet[2932]: I1124 06:56:28.549780 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkpm\" (UniqueName: \"kubernetes.io/projected/6d961835-54f7-4f29-a188-dc2492f544ac-kube-api-access-fpkpm\") pod \"kube-proxy-cnbh6\" (UID: \"6d961835-54f7-4f29-a188-dc2492f544ac\") " pod="kube-system/kube-proxy-cnbh6"
Nov 24 06:56:28.549877 kubelet[2932]: I1124 06:56:28.549799 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6d961835-54f7-4f29-a188-dc2492f544ac-kube-proxy\") pod \"kube-proxy-cnbh6\" (UID: \"6d961835-54f7-4f29-a188-dc2492f544ac\") " pod="kube-system/kube-proxy-cnbh6"
Nov 24 06:56:28.654477 containerd[1621]: time="2025-11-24T06:56:28.654192910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mj5qk,Uid:afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6,Namespace:tigera-operator,Attempt:0,}"
Nov 24 06:56:28.669563 containerd[1621]: time="2025-11-24T06:56:28.669500764Z" level=info msg="connecting to shim c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d" address="unix:///run/containerd/s/8351cd30c2faf08fdd257b7ada586fb4df45766a81936ca21c3463f465e4ee3e" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:28.690736 systemd[1]: Started cri-containerd-c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d.scope - libcontainer container c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d.
Nov 24 06:56:28.725782 containerd[1621]: time="2025-11-24T06:56:28.725745786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-mj5qk,Uid:afc8df1e-7c63-4c7f-ac6c-fb52af81f4c6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d\""
Nov 24 06:56:28.726711 containerd[1621]: time="2025-11-24T06:56:28.726694702Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\""
Nov 24 06:56:29.713640 containerd[1621]: time="2025-11-24T06:56:29.713540005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cnbh6,Uid:6d961835-54f7-4f29-a188-dc2492f544ac,Namespace:kube-system,Attempt:0,}"
Nov 24 06:56:29.848499 containerd[1621]: time="2025-11-24T06:56:29.848466225Z" level=info msg="connecting to shim 01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5" address="unix:///run/containerd/s/63ca8f6557b4240e6e6944244e4612d643ee85e3adce983f3034d2fdb3a757ec" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:29.869712 systemd[1]: Started cri-containerd-01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5.scope - libcontainer container 01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5.
Nov 24 06:56:29.912135 containerd[1621]: time="2025-11-24T06:56:29.912103795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cnbh6,Uid:6d961835-54f7-4f29-a188-dc2492f544ac,Namespace:kube-system,Attempt:0,} returns sandbox id \"01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5\""
Nov 24 06:56:29.928090 containerd[1621]: time="2025-11-24T06:56:29.928049924Z" level=info msg="CreateContainer within sandbox \"01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Nov 24 06:56:29.966538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1092395460.mount: Deactivated successfully.
Nov 24 06:56:29.971083 containerd[1621]: time="2025-11-24T06:56:29.971062293Z" level=info msg="Container 9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d: CDI devices from CRI Config.CDIDevices: []"
Nov 24 06:56:30.055202 containerd[1621]: time="2025-11-24T06:56:30.055164234Z" level=info msg="CreateContainer within sandbox \"01618b4a47840b5eeb20dd52cf48fd40bb9d8d2d5b93473b41c96849729bfee5\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d\""
Nov 24 06:56:30.056085 containerd[1621]: time="2025-11-24T06:56:30.056064984Z" level=info msg="StartContainer for \"9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d\""
Nov 24 06:56:30.057849 containerd[1621]: time="2025-11-24T06:56:30.057827947Z" level=info msg="connecting to shim 9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d" address="unix:///run/containerd/s/63ca8f6557b4240e6e6944244e4612d643ee85e3adce983f3034d2fdb3a757ec" protocol=ttrpc version=3
Nov 24 06:56:30.075740 systemd[1]: Started cri-containerd-9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d.scope - libcontainer container 9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d.
Nov 24 06:56:30.133830 containerd[1621]: time="2025-11-24T06:56:30.133743719Z" level=info msg="StartContainer for \"9a121340401ff7f676bf4ea43d0ea37ce529bbc4f834fddec0e16e0d5be53e5d\" returns successfully"
Nov 24 06:56:30.843072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2556329953.mount: Deactivated successfully.
Nov 24 06:56:31.036356 containerd[1621]: time="2025-11-24T06:56:31.036330381Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:31.036816 containerd[1621]: time="2025-11-24T06:56:31.036802015Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691"
Nov 24 06:56:31.037050 containerd[1621]: time="2025-11-24T06:56:31.037035416Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:31.038036 containerd[1621]: time="2025-11-24T06:56:31.038021645Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:31.038552 containerd[1621]: time="2025-11-24T06:56:31.038476894Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.31176158s"
Nov 24 06:56:31.038552 containerd[1621]: time="2025-11-24T06:56:31.038495362Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\""
Nov 24 06:56:31.040934 containerd[1621]: time="2025-11-24T06:56:31.040916198Z" level=info msg="CreateContainer within sandbox \"c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Nov 24 06:56:31.046026 containerd[1621]: time="2025-11-24T06:56:31.046007490Z" level=info msg="Container e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a: CDI devices from CRI Config.CDIDevices: []"
Nov 24 06:56:31.046343 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount564502530.mount: Deactivated successfully.
Nov 24 06:56:31.053255 containerd[1621]: time="2025-11-24T06:56:31.053232105Z" level=info msg="CreateContainer within sandbox \"c1e691586f4d587ad5995c7f3ed9af62553ad61016ccf704acaf4c6619e7475d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a\""
Nov 24 06:56:31.054187 containerd[1621]: time="2025-11-24T06:56:31.054118836Z" level=info msg="StartContainer for \"e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a\""
Nov 24 06:56:31.054674 containerd[1621]: time="2025-11-24T06:56:31.054662303Z" level=info msg="connecting to shim e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a" address="unix:///run/containerd/s/8351cd30c2faf08fdd257b7ada586fb4df45766a81936ca21c3463f465e4ee3e" protocol=ttrpc version=3
Nov 24 06:56:31.077720 systemd[1]: Started cri-containerd-e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a.scope - libcontainer container e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a.
Nov 24 06:56:31.096910 containerd[1621]: time="2025-11-24T06:56:31.096831156Z" level=info msg="StartContainer for \"e89a78f6eded2ff3b87eb6b07fc4a556b9109bc22133dd755afe3707ce05e44a\" returns successfully"
Nov 24 06:56:31.145686 kubelet[2932]: I1124 06:56:31.145333 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-mj5qk" podStartSLOduration=0.832774971 podStartE2EDuration="3.145322612s" podCreationTimestamp="2025-11-24 06:56:28 +0000 UTC" firstStartedPulling="2025-11-24 06:56:28.726422538 +0000 UTC m=+7.741494337" lastFinishedPulling="2025-11-24 06:56:31.038970177 +0000 UTC m=+10.054041978" observedRunningTime="2025-11-24 06:56:31.14523389 +0000 UTC m=+10.160305693" watchObservedRunningTime="2025-11-24 06:56:31.145322612 +0000 UTC m=+10.160394423"
Nov 24 06:56:32.666580 kubelet[2932]: I1124 06:56:32.666545 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cnbh6" podStartSLOduration=4.666535101 podStartE2EDuration="4.666535101s" podCreationTimestamp="2025-11-24 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:56:31.15158397 +0000 UTC m=+10.166655776" watchObservedRunningTime="2025-11-24 06:56:32.666535101 +0000 UTC m=+11.681606906"
Nov 24 06:56:37.512428 sudo[1942]: pam_unix(sudo:session): session closed for user root
Nov 24 06:56:37.513721 sshd[1941]: Connection closed by 147.75.109.163 port 45480
Nov 24 06:56:37.515105 sshd-session[1938]: pam_unix(sshd:session): session closed for user core
Nov 24 06:56:37.519476 systemd[1]: sshd@6-139.178.70.102:22-147.75.109.163:45480.service: Deactivated successfully.
Nov 24 06:56:37.522647 systemd[1]: session-9.scope: Deactivated successfully.
Nov 24 06:56:37.523405 systemd[1]: session-9.scope: Consumed 3.153s CPU time, 154.7M memory peak.
Nov 24 06:56:37.524885 systemd-logind[1591]: Session 9 logged out. Waiting for processes to exit.
Nov 24 06:56:37.526435 systemd-logind[1591]: Removed session 9.
Nov 24 06:56:40.644013 systemd[1]: Created slice kubepods-besteffort-pod9cb7da2c_1dbe_4197_9328_4b260673fd71.slice - libcontainer container kubepods-besteffort-pod9cb7da2c_1dbe_4197_9328_4b260673fd71.slice.
Nov 24 06:56:40.727305 kubelet[2932]: I1124 06:56:40.727267 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9cb7da2c-1dbe-4197-9328-4b260673fd71-typha-certs\") pod \"calico-typha-bc5d8c879-qkp8f\" (UID: \"9cb7da2c-1dbe-4197-9328-4b260673fd71\") " pod="calico-system/calico-typha-bc5d8c879-qkp8f"
Nov 24 06:56:40.727305 kubelet[2932]: I1124 06:56:40.727306 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdrz\" (UniqueName: \"kubernetes.io/projected/9cb7da2c-1dbe-4197-9328-4b260673fd71-kube-api-access-trdrz\") pod \"calico-typha-bc5d8c879-qkp8f\" (UID: \"9cb7da2c-1dbe-4197-9328-4b260673fd71\") " pod="calico-system/calico-typha-bc5d8c879-qkp8f"
Nov 24 06:56:40.727615 kubelet[2932]: I1124 06:56:40.727331 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cb7da2c-1dbe-4197-9328-4b260673fd71-tigera-ca-bundle\") pod \"calico-typha-bc5d8c879-qkp8f\" (UID: \"9cb7da2c-1dbe-4197-9328-4b260673fd71\") " pod="calico-system/calico-typha-bc5d8c879-qkp8f"
Nov 24 06:56:40.908741 systemd[1]: Created slice kubepods-besteffort-podaa3ea5c9_d165_4def_88c8_dc67afacb115.slice - libcontainer container kubepods-besteffort-podaa3ea5c9_d165_4def_88c8_dc67afacb115.slice.
Nov 24 06:56:40.928825 kubelet[2932]: I1124 06:56:40.928794 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aa3ea5c9-d165-4def-88c8-dc67afacb115-node-certs\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929157 kubelet[2932]: I1124 06:56:40.928982 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-cni-bin-dir\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929157 kubelet[2932]: I1124 06:56:40.928996 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-cni-net-dir\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929157 kubelet[2932]: I1124 06:56:40.929005 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-var-lib-calico\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929157 kubelet[2932]: I1124 06:56:40.929013 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa3ea5c9-d165-4def-88c8-dc67afacb115-tigera-ca-bundle\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929157 kubelet[2932]: I1124 06:56:40.929063 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-flexvol-driver-host\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929259 kubelet[2932]: I1124 06:56:40.929076 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-policysync\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929377 kubelet[2932]: I1124 06:56:40.929314 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-var-run-calico\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929377 kubelet[2932]: I1124 06:56:40.929333 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-xtables-lock\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929377 kubelet[2932]: I1124 06:56:40.929353 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-cni-log-dir\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929377 kubelet[2932]: I1124 06:56:40.929362 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa3ea5c9-d165-4def-88c8-dc67afacb115-lib-modules\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.929491 kubelet[2932]: I1124 06:56:40.929462 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqs7\" (UniqueName: \"kubernetes.io/projected/aa3ea5c9-d165-4def-88c8-dc67afacb115-kube-api-access-4nqs7\") pod \"calico-node-wlp6h\" (UID: \"aa3ea5c9-d165-4def-88c8-dc67afacb115\") " pod="calico-system/calico-node-wlp6h"
Nov 24 06:56:40.963851 containerd[1621]: time="2025-11-24T06:56:40.963817263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bc5d8c879-qkp8f,Uid:9cb7da2c-1dbe-4197-9328-4b260673fd71,Namespace:calico-system,Attempt:0,}"
Nov 24 06:56:40.982287 containerd[1621]: time="2025-11-24T06:56:40.982251342Z" level=info msg="connecting to shim 84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17" address="unix:///run/containerd/s/98915de8db30ac52582f0bb09dffda655dd7af76a5c26c4f0a991008066a8dcb" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:41.004950 systemd[1]: Started cri-containerd-84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17.scope - libcontainer container 84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17.
Nov 24 06:56:41.040529 kubelet[2932]: E1124 06:56:41.040513 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.040794 kubelet[2932]: W1124 06:56:41.040632 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.045656 kubelet[2932]: E1124 06:56:41.045142 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.046530 kubelet[2932]: E1124 06:56:41.046371 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.046530 kubelet[2932]: W1124 06:56:41.046381 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.046530 kubelet[2932]: E1124 06:56:41.046391 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.058045 containerd[1621]: time="2025-11-24T06:56:41.057994134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bc5d8c879-qkp8f,Uid:9cb7da2c-1dbe-4197-9328-4b260673fd71,Namespace:calico-system,Attempt:0,} returns sandbox id \"84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17\""
Nov 24 06:56:41.059713 containerd[1621]: time="2025-11-24T06:56:41.059681915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Nov 24 06:56:41.120723 kubelet[2932]: E1124 06:56:41.120692 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:41.212232 containerd[1621]: time="2025-11-24T06:56:41.212107838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wlp6h,Uid:aa3ea5c9-d165-4def-88c8-dc67afacb115,Namespace:calico-system,Attempt:0,}"
Nov 24 06:56:41.222448 kubelet[2932]: E1124 06:56:41.222264 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.222448 kubelet[2932]: W1124 06:56:41.222381 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.222448 kubelet[2932]: E1124 06:56:41.222401 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.222933 kubelet[2932]: E1124 06:56:41.222908 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.223099 kubelet[2932]: W1124 06:56:41.222918 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.223099 kubelet[2932]: E1124 06:56:41.222979 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.223367 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.225542 kubelet[2932]: W1124 06:56:41.223374 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.223380 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.223534 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.225542 kubelet[2932]: W1124 06:56:41.223539 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.223545 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.224062 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.225542 kubelet[2932]: W1124 06:56:41.224069 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.224075 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.225542 kubelet[2932]: E1124 06:56:41.224185 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230177 containerd[1621]: time="2025-11-24T06:56:41.224743465Z" level=info msg="connecting to shim 0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a" address="unix:///run/containerd/s/1c986947fb8b3e68d47495aa38e566039470b9d2f78d5b20f764c4e1208381f5" namespace=k8s.io protocol=ttrpc version=3
Nov 24 06:56:41.230207 kubelet[2932]: W1124 06:56:41.224189 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.224194 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.224554 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230207 kubelet[2932]: W1124 06:56:41.224560 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.224566 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.225029 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230207 kubelet[2932]: W1124 06:56:41.225037 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.225056 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230207 kubelet[2932]: E1124 06:56:41.225167 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230207 kubelet[2932]: W1124 06:56:41.225193 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225204 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225315 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230392 kubelet[2932]: W1124 06:56:41.225320 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225325 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225421 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230392 kubelet[2932]: W1124 06:56:41.225427 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225440 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225711 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230392 kubelet[2932]: W1124 06:56:41.225717 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230392 kubelet[2932]: E1124 06:56:41.225723 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.225921 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.230584 kubelet[2932]: W1124 06:56:41.225927 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.225950 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.226058 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230584 kubelet[2932]: W1124 06:56:41.226064 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.226070 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.226166 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230584 kubelet[2932]: W1124 06:56:41.226172 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.226179 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.230584 kubelet[2932]: E1124 06:56:41.226360 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230795 kubelet[2932]: W1124 06:56:41.226365 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226370 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226464 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230795 kubelet[2932]: W1124 06:56:41.226469 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226474 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226673 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230795 kubelet[2932]: W1124 06:56:41.226680 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226687 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.230795 kubelet[2932]: E1124 06:56:41.226776 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.230795 kubelet[2932]: W1124 06:56:41.226780 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.226785 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.226898 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.231512 kubelet[2932]: W1124 06:56:41.226903 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.226907 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.231276 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.231512 kubelet[2932]: W1124 06:56:41.231284 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.231294 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.231512 kubelet[2932]: I1124 06:56:41.231321 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ec589a89-1333-4d00-aa6a-417830a62536-kubelet-dir\") pod \"csi-node-driver-hsxdh\" (UID: \"ec589a89-1333-4d00-aa6a-417830a62536\") " pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:41.231512 kubelet[2932]: E1124 06:56:41.231448 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.231761 kubelet[2932]: W1124 06:56:41.231454 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.231761 kubelet[2932]: E1124 06:56:41.231459 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.231761 kubelet[2932]: I1124 06:56:41.231475 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec589a89-1333-4d00-aa6a-417830a62536-registration-dir\") pod \"csi-node-driver-hsxdh\" (UID: \"ec589a89-1333-4d00-aa6a-417830a62536\") " pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:41.231951 kubelet[2932]: E1124 06:56:41.231843 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.231951 kubelet[2932]: W1124 06:56:41.231851 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.231951 kubelet[2932]: E1124 06:56:41.231858 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.231951 kubelet[2932]: I1124 06:56:41.231873 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqh2z\" (UniqueName: \"kubernetes.io/projected/ec589a89-1333-4d00-aa6a-417830a62536-kube-api-access-vqh2z\") pod \"csi-node-driver-hsxdh\" (UID: \"ec589a89-1333-4d00-aa6a-417830a62536\") " pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:41.232225 kubelet[2932]: E1124 06:56:41.232185 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.232225 kubelet[2932]: W1124 06:56:41.232196 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.232274 kubelet[2932]: E1124 06:56:41.232204 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.232350 kubelet[2932]: I1124 06:56:41.232316 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec589a89-1333-4d00-aa6a-417830a62536-socket-dir\") pod \"csi-node-driver-hsxdh\" (UID: \"ec589a89-1333-4d00-aa6a-417830a62536\") " pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:41.232515 kubelet[2932]: E1124 06:56:41.232507 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.232580 kubelet[2932]: W1124 06:56:41.232542 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.232580 kubelet[2932]: E1124 06:56:41.232551 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.232777 kubelet[2932]: I1124 06:56:41.232742 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ec589a89-1333-4d00-aa6a-417830a62536-varrun\") pod \"csi-node-driver-hsxdh\" (UID: \"ec589a89-1333-4d00-aa6a-417830a62536\") " pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:41.233135 kubelet[2932]: E1124 06:56:41.233126 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.233220 kubelet[2932]: W1124 06:56:41.233181 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.233220 kubelet[2932]: E1124 06:56:41.233191 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.233425 kubelet[2932]: E1124 06:56:41.233407 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.233425 kubelet[2932]: W1124 06:56:41.233413 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.233425 kubelet[2932]: E1124 06:56:41.233419 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.233608 kubelet[2932]: E1124 06:56:41.233602 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.233719 kubelet[2932]: W1124 06:56:41.233673 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.233719 kubelet[2932]: E1124 06:56:41.233682 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.233864 kubelet[2932]: E1124 06:56:41.233858 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.233934 kubelet[2932]: W1124 06:56:41.233902 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.233934 kubelet[2932]: E1124 06:56:41.233927 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.234135 kubelet[2932]: E1124 06:56:41.234114 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.234135 kubelet[2932]: W1124 06:56:41.234122 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.234135 kubelet[2932]: E1124 06:56:41.234127 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.234348 kubelet[2932]: E1124 06:56:41.234341 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.234411 kubelet[2932]: W1124 06:56:41.234388 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.234411 kubelet[2932]: E1124 06:56:41.234402 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.234611 kubelet[2932]: E1124 06:56:41.234584 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.234611 kubelet[2932]: W1124 06:56:41.234591 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.234611 kubelet[2932]: E1124 06:56:41.234597 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.234851 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.241486 kubelet[2932]: W1124 06:56:41.234856 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.234862 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.234979 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.241486 kubelet[2932]: W1124 06:56:41.234988 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.234996 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.235090 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.241486 kubelet[2932]: W1124 06:56:41.235095 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.241486 kubelet[2932]: E1124 06:56:41.235100 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.243792 systemd[1]: Started cri-containerd-0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a.scope - libcontainer container 0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a. 
Nov 24 06:56:41.261701 containerd[1621]: time="2025-11-24T06:56:41.261606869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wlp6h,Uid:aa3ea5c9-d165-4def-88c8-dc67afacb115,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\""
Nov 24 06:56:41.338073 kubelet[2932]: E1124 06:56:41.338060 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:41.338073 kubelet[2932]: W1124 06:56:41.338068 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:41.338073 kubelet[2932]: E1124 06:56:41.338074 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.338194 kubelet[2932]: E1124 06:56:41.338183 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.338194 kubelet[2932]: W1124 06:56:41.338190 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.338269 kubelet[2932]: E1124 06:56:41.338195 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.338299 kubelet[2932]: E1124 06:56:41.338283 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.338299 kubelet[2932]: W1124 06:56:41.338287 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.338299 kubelet[2932]: E1124 06:56:41.338292 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:41.338401 kubelet[2932]: E1124 06:56:41.338383 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.338401 kubelet[2932]: W1124 06:56:41.338387 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.338401 kubelet[2932]: E1124 06:56:41.338391 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:41.344802 kubelet[2932]: E1124 06:56:41.344780 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:41.344802 kubelet[2932]: W1124 06:56:41.344795 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:41.344802 kubelet[2932]: E1124 06:56:41.344808 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:42.486615 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3313940395.mount: Deactivated successfully. 
Nov 24 06:56:43.104650 kubelet[2932]: E1124 06:56:43.104137 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:43.727521 containerd[1621]: time="2025-11-24T06:56:43.727486252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:43.735661 containerd[1621]: time="2025-11-24T06:56:43.735476794Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Nov 24 06:56:43.739557 containerd[1621]: time="2025-11-24T06:56:43.739529943Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:43.748565 containerd[1621]: time="2025-11-24T06:56:43.748501320Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:43.749187 containerd[1621]: time="2025-11-24T06:56:43.748914146Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.689213199s"
Nov 24 06:56:43.749187 containerd[1621]: time="2025-11-24T06:56:43.748935180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Nov 24 06:56:43.757223 containerd[1621]: time="2025-11-24T06:56:43.749811674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Nov 24 06:56:43.778578 containerd[1621]: time="2025-11-24T06:56:43.778548968Z" level=info msg="CreateContainer within sandbox \"84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Nov 24 06:56:43.811642 containerd[1621]: time="2025-11-24T06:56:43.811065454Z" level=info msg="Container 0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4: CDI devices from CRI Config.CDIDevices: []"
Nov 24 06:56:43.812943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3729024589.mount: Deactivated successfully.
Nov 24 06:56:43.841861 containerd[1621]: time="2025-11-24T06:56:43.841825681Z" level=info msg="CreateContainer within sandbox \"84341f7feb5f6c973f6a903565a66bb173ee094adb11837b6c2945149edc8d17\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4\""
Nov 24 06:56:43.844656 containerd[1621]: time="2025-11-24T06:56:43.843905890Z" level=info msg="StartContainer for \"0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4\""
Nov 24 06:56:43.845366 containerd[1621]: time="2025-11-24T06:56:43.845335634Z" level=info msg="connecting to shim 0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4" address="unix:///run/containerd/s/98915de8db30ac52582f0bb09dffda655dd7af76a5c26c4f0a991008066a8dcb" protocol=ttrpc version=3
Nov 24 06:56:43.879798 systemd[1]: Started cri-containerd-0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4.scope - libcontainer container 0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4.
Nov 24 06:56:43.937651 containerd[1621]: time="2025-11-24T06:56:43.937610553Z" level=info msg="StartContainer for \"0db36b0ba7d1e27d4f8e0c0b013d1db9665c799977b2fe53bb61f889333ccbc4\" returns successfully"
Nov 24 06:56:44.244073 kubelet[2932]: E1124 06:56:44.243929 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:44.244073 kubelet[2932]: W1124 06:56:44.243944 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:44.251639 kubelet[2932]: E1124 06:56:44.250802 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the preceding three kubelet messages repeat 32 more times, through Nov 24 06:56:44.267923 ...]
Nov 24 06:56:45.104548 kubelet[2932]: E1124 06:56:45.104306 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:45.180427 kubelet[2932]: I1124 06:56:45.180398 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 24 06:56:45.263643 kubelet[2932]: E1124 06:56:45.263073 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:45.263643 kubelet[2932]: W1124 06:56:45.263089 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:45.263643 kubelet[2932]: E1124 06:56:45.263104 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the preceding three kubelet messages repeat 3 more times, through Nov 24 06:56:45.263542 ...]
Nov 24 06:56:45.264098 kubelet[2932]: E1124 06:56:45.263704 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Nov 24 06:56:45.264098 kubelet[2932]: W1124 06:56:45.263711 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Nov 24 06:56:45.264098 kubelet[2932]: E1124 06:56:45.263718 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Nov 24 06:56:45.264098 kubelet[2932]: E1124 06:56:45.263890 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.264098 kubelet[2932]: W1124 06:56:45.263897 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.264098 kubelet[2932]: E1124 06:56:45.263903 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.264098 kubelet[2932]: E1124 06:56:45.264095 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.264098 kubelet[2932]: W1124 06:56:45.264101 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.264309 kubelet[2932]: E1124 06:56:45.264107 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.264309 kubelet[2932]: E1124 06:56:45.264233 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.264309 kubelet[2932]: W1124 06:56:45.264239 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.264309 kubelet[2932]: E1124 06:56:45.264245 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.264391 kubelet[2932]: E1124 06:56:45.264362 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.264391 kubelet[2932]: W1124 06:56:45.264367 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.264391 kubelet[2932]: E1124 06:56:45.264373 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264499 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265633 kubelet[2932]: W1124 06:56:45.264508 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264513 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264701 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265633 kubelet[2932]: W1124 06:56:45.264707 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264714 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264880 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265633 kubelet[2932]: W1124 06:56:45.264886 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.264892 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.265633 kubelet[2932]: E1124 06:56:45.265033 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265859 kubelet[2932]: W1124 06:56:45.265054 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265859 kubelet[2932]: E1124 06:56:45.265060 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.265859 kubelet[2932]: E1124 06:56:45.265190 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265859 kubelet[2932]: W1124 06:56:45.265196 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265859 kubelet[2932]: E1124 06:56:45.265201 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.265859 kubelet[2932]: E1124 06:56:45.265306 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.265859 kubelet[2932]: W1124 06:56:45.265316 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.265859 kubelet[2932]: E1124 06:56:45.265322 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.266538 kubelet[2932]: E1124 06:56:45.266428 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.266538 kubelet[2932]: W1124 06:56:45.266437 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.266538 kubelet[2932]: E1124 06:56:45.266444 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.266618 kubelet[2932]: E1124 06:56:45.266555 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.266618 kubelet[2932]: W1124 06:56:45.266561 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.266618 kubelet[2932]: E1124 06:56:45.266568 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.266731 kubelet[2932]: E1124 06:56:45.266714 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.266731 kubelet[2932]: W1124 06:56:45.266723 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.266731 kubelet[2932]: E1124 06:56:45.266730 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.266929 kubelet[2932]: E1124 06:56:45.266874 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.266929 kubelet[2932]: W1124 06:56:45.266882 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.266929 kubelet[2932]: E1124 06:56:45.266888 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.267011 kubelet[2932]: E1124 06:56:45.266999 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.267011 kubelet[2932]: W1124 06:56:45.267004 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.267011 kubelet[2932]: E1124 06:56:45.267010 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.267238 kubelet[2932]: E1124 06:56:45.267126 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.267238 kubelet[2932]: W1124 06:56:45.267135 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.267238 kubelet[2932]: E1124 06:56:45.267143 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.267319 kubelet[2932]: E1124 06:56:45.267268 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.267319 kubelet[2932]: W1124 06:56:45.267273 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.267319 kubelet[2932]: E1124 06:56:45.267279 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.267565 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274494 kubelet[2932]: W1124 06:56:45.267575 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.267584 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.268009 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274494 kubelet[2932]: W1124 06:56:45.268016 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.268024 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.268139 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274494 kubelet[2932]: W1124 06:56:45.268145 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.268152 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.274494 kubelet[2932]: E1124 06:56:45.268284 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274783 kubelet[2932]: W1124 06:56:45.268290 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268296 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268477 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274783 kubelet[2932]: W1124 06:56:45.268484 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268491 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268721 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274783 kubelet[2932]: W1124 06:56:45.268728 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268735 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.274783 kubelet[2932]: E1124 06:56:45.268848 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.274783 kubelet[2932]: W1124 06:56:45.268854 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.268883 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269010 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.275081 kubelet[2932]: W1124 06:56:45.269016 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269022 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269175 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.275081 kubelet[2932]: W1124 06:56:45.269181 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269188 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269346 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.275081 kubelet[2932]: W1124 06:56:45.269351 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.275081 kubelet[2932]: E1124 06:56:45.269357 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Nov 24 06:56:45.275485 kubelet[2932]: E1124 06:56:45.269584 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Nov 24 06:56:45.275485 kubelet[2932]: W1124 06:56:45.269591 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Nov 24 06:56:45.275485 kubelet[2932]: E1124 06:56:45.269599 2932 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Nov 24 06:56:45.351115 containerd[1621]: time="2025-11-24T06:56:45.351087903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:45.351483 containerd[1621]: time="2025-11-24T06:56:45.351422354Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Nov 24 06:56:45.351825 containerd[1621]: time="2025-11-24T06:56:45.351802337Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:45.352875 containerd[1621]: time="2025-11-24T06:56:45.352852413Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:45.353277 containerd[1621]: time="2025-11-24T06:56:45.353259048Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.603428072s" Nov 24 06:56:45.353303 containerd[1621]: time="2025-11-24T06:56:45.353278982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Nov 24 06:56:45.406898 containerd[1621]: time="2025-11-24T06:56:45.406841454Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Nov 24 06:56:45.419989 containerd[1621]: time="2025-11-24T06:56:45.419957418Z" level=info msg="Container b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:56:45.427902 containerd[1621]: time="2025-11-24T06:56:45.427869474Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169\"" Nov 24 06:56:45.428728 containerd[1621]: time="2025-11-24T06:56:45.428600840Z" level=info msg="StartContainer for \"b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169\"" Nov 24 06:56:45.431231 containerd[1621]: time="2025-11-24T06:56:45.431130355Z" level=info msg="connecting to shim b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169" address="unix:///run/containerd/s/1c986947fb8b3e68d47495aa38e566039470b9d2f78d5b20f764c4e1208381f5" protocol=ttrpc version=3 Nov 24 06:56:45.455747 systemd[1]: Started cri-containerd-b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169.scope - libcontainer container b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169. Nov 24 06:56:45.510931 containerd[1621]: time="2025-11-24T06:56:45.510895360Z" level=info msg="StartContainer for \"b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169\" returns successfully" Nov 24 06:56:45.525210 systemd[1]: cri-containerd-b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169.scope: Deactivated successfully. 
Nov 24 06:56:45.553478 containerd[1621]: time="2025-11-24T06:56:45.553365447Z" level=info msg="received container exit event container_id:\"b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169\" id:\"b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169\" pid:3644 exited_at:{seconds:1763967405 nanos:526302216}"
Nov 24 06:56:45.604417 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b8dd8b3f61f55826a1cfca4e5c661df71427bca4d79e62e767f0ec57338a1169-rootfs.mount: Deactivated successfully.
Nov 24 06:56:46.179286 containerd[1621]: time="2025-11-24T06:56:46.179252610Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Nov 24 06:56:46.204141 kubelet[2932]: I1124 06:56:46.204096 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-bc5d8c879-qkp8f" podStartSLOduration=3.513210973 podStartE2EDuration="6.203395947s" podCreationTimestamp="2025-11-24 06:56:40 +0000 UTC" firstStartedPulling="2025-11-24 06:56:41.059281813 +0000 UTC m=+20.074353613" lastFinishedPulling="2025-11-24 06:56:43.749466784 +0000 UTC m=+22.764538587" observedRunningTime="2025-11-24 06:56:44.227799608 +0000 UTC m=+23.242871413" watchObservedRunningTime="2025-11-24 06:56:46.203395947 +0000 UTC m=+25.218467754"
Nov 24 06:56:47.104867 kubelet[2932]: E1124 06:56:47.104813 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:49.105063 kubelet[2932]: E1124 06:56:49.105033 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:50.118396 containerd[1621]: time="2025-11-24T06:56:50.118367599Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:50.125913 containerd[1621]: time="2025-11-24T06:56:50.125883864Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859"
Nov 24 06:56:50.132840 containerd[1621]: time="2025-11-24T06:56:50.132817518Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:50.138608 containerd[1621]: time="2025-11-24T06:56:50.138559160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Nov 24 06:56:50.139275 containerd[1621]: time="2025-11-24T06:56:50.138865310Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.959585595s"
Nov 24 06:56:50.139275 containerd[1621]: time="2025-11-24T06:56:50.138885866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\""
Nov 24 06:56:50.155146 containerd[1621]: time="2025-11-24T06:56:50.155118802Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Nov 24 06:56:50.184097 containerd[1621]: time="2025-11-24T06:56:50.182829466Z" level=info msg="Container 42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1: CDI devices from CRI Config.CDIDevices: []"
Nov 24 06:56:50.195161 containerd[1621]: time="2025-11-24T06:56:50.195120722Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1\""
Nov 24 06:56:50.195860 containerd[1621]: time="2025-11-24T06:56:50.195816639Z" level=info msg="StartContainer for \"42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1\""
Nov 24 06:56:50.197120 containerd[1621]: time="2025-11-24T06:56:50.197095148Z" level=info msg="connecting to shim 42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1" address="unix:///run/containerd/s/1c986947fb8b3e68d47495aa38e566039470b9d2f78d5b20f764c4e1208381f5" protocol=ttrpc version=3
Nov 24 06:56:50.217757 systemd[1]: Started cri-containerd-42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1.scope - libcontainer container 42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1.
Nov 24 06:56:50.292510 containerd[1621]: time="2025-11-24T06:56:50.292428817Z" level=info msg="StartContainer for \"42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1\" returns successfully"
Nov 24 06:56:51.197640 kubelet[2932]: E1124 06:56:51.197490 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:56:51.841815 systemd[1]: cri-containerd-42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1.scope: Deactivated successfully.
Nov 24 06:56:51.842066 systemd[1]: cri-containerd-42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1.scope: Consumed 332ms CPU time, 161.5M memory peak, 2.6M read from disk, 171.3M written to disk.
Nov 24 06:56:51.886866 containerd[1621]: time="2025-11-24T06:56:51.886833506Z" level=info msg="received container exit event container_id:\"42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1\" id:\"42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1\" pid:3706 exited_at:{seconds:1763967411 nanos:875918516}"
Nov 24 06:56:51.916427 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-42d8abc8567affcdeb7d9c00ad3f2b655684f7a4ff12cdbdd75e054ce69478b1-rootfs.mount: Deactivated successfully.
Nov 24 06:56:51.918435 kubelet[2932]: I1124 06:56:51.918406 2932 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Nov 24 06:56:51.963099 systemd[1]: Created slice kubepods-burstable-pod7332a908_d485_404d_91ee_a472cfca4232.slice - libcontainer container kubepods-burstable-pod7332a908_d485_404d_91ee_a472cfca4232.slice.
Nov 24 06:56:51.974870 systemd[1]: Created slice kubepods-burstable-podc9b82770_9b92_4ba0_9d3c_7b0c3edc869c.slice - libcontainer container kubepods-burstable-podc9b82770_9b92_4ba0_9d3c_7b0c3edc869c.slice.
Nov 24 06:56:51.985610 systemd[1]: Created slice kubepods-besteffort-pod77c9c1c2_ff92_4d21_b427_835b49d2e048.slice - libcontainer container kubepods-besteffort-pod77c9c1c2_ff92_4d21_b427_835b49d2e048.slice.
Nov 24 06:56:51.992863 systemd[1]: Created slice kubepods-besteffort-pod8655a062_8ee8_4565_9af9_1c36ab263987.slice - libcontainer container kubepods-besteffort-pod8655a062_8ee8_4565_9af9_1c36ab263987.slice.
Nov 24 06:56:51.997589 systemd[1]: Created slice kubepods-besteffort-poda8c5d990_4028_4565_8488_4dcc003e63da.slice - libcontainer container kubepods-besteffort-poda8c5d990_4028_4565_8488_4dcc003e63da.slice.
Nov 24 06:56:52.002696 systemd[1]: Created slice kubepods-besteffort-pod3079118d_9876_4056_a671_92f88f5f8c3d.slice - libcontainer container kubepods-besteffort-pod3079118d_9876_4056_a671_92f88f5f8c3d.slice.
Nov 24 06:56:52.007731 systemd[1]: Created slice kubepods-besteffort-pode9872522_30cd_4303_b0f4_9d477ec17bc5.slice - libcontainer container kubepods-besteffort-pode9872522_30cd_4303_b0f4_9d477ec17bc5.slice.
Nov 24 06:56:52.011863 systemd[1]: Created slice kubepods-besteffort-pod56a1253d_b0f7_4032_98a1_7eca8d8f6d62.slice - libcontainer container kubepods-besteffort-pod56a1253d_b0f7_4032_98a1_7eca8d8f6d62.slice.
Nov 24 06:56:52.065713 kubelet[2932]: I1124 06:56:52.065665 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74s7f\" (UniqueName: \"kubernetes.io/projected/77c9c1c2-ff92-4d21-b427-835b49d2e048-kube-api-access-74s7f\") pod \"calico-apiserver-7c65846d8b-n55vh\" (UID: \"77c9c1c2-ff92-4d21-b427-835b49d2e048\") " pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh"
Nov 24 06:56:52.065713 kubelet[2932]: I1124 06:56:52.065694 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9b82770-9b92-4ba0-9d3c-7b0c3edc869c-config-volume\") pod \"coredns-674b8bbfcf-5jnz6\" (UID: \"c9b82770-9b92-4ba0-9d3c-7b0c3edc869c\") " pod="kube-system/coredns-674b8bbfcf-5jnz6"
Nov 24 06:56:52.065946 kubelet[2932]: I1124 06:56:52.065870 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7332a908-d485-404d-91ee-a472cfca4232-config-volume\") pod \"coredns-674b8bbfcf-gkfn4\" (UID: \"7332a908-d485-404d-91ee-a472cfca4232\") " pod="kube-system/coredns-674b8bbfcf-gkfn4"
Nov 24 06:56:52.065946 kubelet[2932]: I1124 06:56:52.065885 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"kube-api-access-2jpqh\" (UniqueName: \"kubernetes.io/projected/7332a908-d485-404d-91ee-a472cfca4232-kube-api-access-2jpqh\") pod \"coredns-674b8bbfcf-gkfn4\" (UID: \"7332a908-d485-404d-91ee-a472cfca4232\") " pod="kube-system/coredns-674b8bbfcf-gkfn4" Nov 24 06:56:52.065946 kubelet[2932]: I1124 06:56:52.065895 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8655a062-8ee8-4565-9af9-1c36ab263987-calico-apiserver-certs\") pod \"calico-apiserver-6757d5779b-2xfwn\" (UID: \"8655a062-8ee8-4565-9af9-1c36ab263987\") " pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" Nov 24 06:56:52.065946 kubelet[2932]: I1124 06:56:52.065906 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/56a1253d-b0f7-4032-98a1-7eca8d8f6d62-calico-apiserver-certs\") pod \"calico-apiserver-7c65846d8b-csbwd\" (UID: \"56a1253d-b0f7-4032-98a1-7eca8d8f6d62\") " pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" Nov 24 06:56:52.066136 kubelet[2932]: I1124 06:56:52.066014 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3079118d-9876-4056-a671-92f88f5f8c3d-goldmane-ca-bundle\") pod \"goldmane-666569f655-swsqx\" (UID: \"3079118d-9876-4056-a671-92f88f5f8c3d\") " pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:52.066136 kubelet[2932]: I1124 06:56:52.066037 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbjn\" (UniqueName: \"kubernetes.io/projected/c9b82770-9b92-4ba0-9d3c-7b0c3edc869c-kube-api-access-mfbjn\") pod \"coredns-674b8bbfcf-5jnz6\" (UID: \"c9b82770-9b92-4ba0-9d3c-7b0c3edc869c\") " pod="kube-system/coredns-674b8bbfcf-5jnz6" Nov 24 06:56:52.066136 kubelet[2932]: 
I1124 06:56:52.066047 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-backend-key-pair\") pod \"whisker-79966f5546-bkgtb\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " pod="calico-system/whisker-79966f5546-bkgtb" Nov 24 06:56:52.066136 kubelet[2932]: I1124 06:56:52.066055 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kccq\" (UniqueName: \"kubernetes.io/projected/a8c5d990-4028-4565-8488-4dcc003e63da-kube-api-access-8kccq\") pod \"whisker-79966f5546-bkgtb\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " pod="calico-system/whisker-79966f5546-bkgtb" Nov 24 06:56:52.066136 kubelet[2932]: I1124 06:56:52.066066 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3079118d-9876-4056-a671-92f88f5f8c3d-config\") pod \"goldmane-666569f655-swsqx\" (UID: \"3079118d-9876-4056-a671-92f88f5f8c3d\") " pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:52.071528 kubelet[2932]: I1124 06:56:52.066077 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3079118d-9876-4056-a671-92f88f5f8c3d-goldmane-key-pair\") pod \"goldmane-666569f655-swsqx\" (UID: \"3079118d-9876-4056-a671-92f88f5f8c3d\") " pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:52.071528 kubelet[2932]: I1124 06:56:52.066347 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjhk\" (UniqueName: \"kubernetes.io/projected/e9872522-30cd-4303-b0f4-9d477ec17bc5-kube-api-access-qzjhk\") pod \"calico-kube-controllers-79b4855f45-7htjs\" (UID: \"e9872522-30cd-4303-b0f4-9d477ec17bc5\") " 
pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" Nov 24 06:56:52.071528 kubelet[2932]: I1124 06:56:52.066362 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/77c9c1c2-ff92-4d21-b427-835b49d2e048-calico-apiserver-certs\") pod \"calico-apiserver-7c65846d8b-n55vh\" (UID: \"77c9c1c2-ff92-4d21-b427-835b49d2e048\") " pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" Nov 24 06:56:52.071528 kubelet[2932]: I1124 06:56:52.066372 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctht\" (UniqueName: \"kubernetes.io/projected/3079118d-9876-4056-a671-92f88f5f8c3d-kube-api-access-wctht\") pod \"goldmane-666569f655-swsqx\" (UID: \"3079118d-9876-4056-a671-92f88f5f8c3d\") " pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:52.071528 kubelet[2932]: I1124 06:56:52.066415 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-ca-bundle\") pod \"whisker-79966f5546-bkgtb\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " pod="calico-system/whisker-79966f5546-bkgtb" Nov 24 06:56:52.076449 kubelet[2932]: I1124 06:56:52.066427 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ck7\" (UniqueName: \"kubernetes.io/projected/8655a062-8ee8-4565-9af9-1c36ab263987-kube-api-access-p9ck7\") pod \"calico-apiserver-6757d5779b-2xfwn\" (UID: \"8655a062-8ee8-4565-9af9-1c36ab263987\") " pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" Nov 24 06:56:52.076449 kubelet[2932]: I1124 06:56:52.066438 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e9872522-30cd-4303-b0f4-9d477ec17bc5-tigera-ca-bundle\") pod \"calico-kube-controllers-79b4855f45-7htjs\" (UID: \"e9872522-30cd-4303-b0f4-9d477ec17bc5\") " pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" Nov 24 06:56:52.076449 kubelet[2932]: I1124 06:56:52.066447 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kszgb\" (UniqueName: \"kubernetes.io/projected/56a1253d-b0f7-4032-98a1-7eca8d8f6d62-kube-api-access-kszgb\") pod \"calico-apiserver-7c65846d8b-csbwd\" (UID: \"56a1253d-b0f7-4032-98a1-7eca8d8f6d62\") " pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" Nov 24 06:56:52.278005 containerd[1621]: time="2025-11-24T06:56:52.277919235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gkfn4,Uid:7332a908-d485-404d-91ee-a472cfca4232,Namespace:kube-system,Attempt:0,}" Nov 24 06:56:52.280812 containerd[1621]: time="2025-11-24T06:56:52.280787962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5jnz6,Uid:c9b82770-9b92-4ba0-9d3c-7b0c3edc869c,Namespace:kube-system,Attempt:0,}" Nov 24 06:56:52.284414 containerd[1621]: time="2025-11-24T06:56:52.284387843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Nov 24 06:56:52.298019 containerd[1621]: time="2025-11-24T06:56:52.297989110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-n55vh,Uid:77c9c1c2-ff92-4d21-b427-835b49d2e048,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:56:52.298148 containerd[1621]: time="2025-11-24T06:56:52.298132717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6757d5779b-2xfwn,Uid:8655a062-8ee8-4565-9af9-1c36ab263987,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:56:52.302413 containerd[1621]: time="2025-11-24T06:56:52.302385158Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-79966f5546-bkgtb,Uid:a8c5d990-4028-4565-8488-4dcc003e63da,Namespace:calico-system,Attempt:0,}" Nov 24 06:56:52.305217 containerd[1621]: time="2025-11-24T06:56:52.305198902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-swsqx,Uid:3079118d-9876-4056-a671-92f88f5f8c3d,Namespace:calico-system,Attempt:0,}" Nov 24 06:56:52.310818 containerd[1621]: time="2025-11-24T06:56:52.310793371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79b4855f45-7htjs,Uid:e9872522-30cd-4303-b0f4-9d477ec17bc5,Namespace:calico-system,Attempt:0,}" Nov 24 06:56:52.318284 containerd[1621]: time="2025-11-24T06:56:52.317072056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-csbwd,Uid:56a1253d-b0f7-4032-98a1-7eca8d8f6d62,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:56:53.095593 containerd[1621]: time="2025-11-24T06:56:53.095558575Z" level=error msg="Failed to destroy network for sandbox \"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.097071 systemd[1]: run-netns-cni\x2de3cd489a\x2d1b8b\x2d4d33\x2dcb19\x2d303dba4b9a5e.mount: Deactivated successfully. Nov 24 06:56:53.101200 systemd[1]: run-netns-cni\x2debd4c2d2\x2d8582\x2d7e90\x2de4e5\x2dd3d45cdccace.mount: Deactivated successfully. 
Nov 24 06:56:53.108716 containerd[1621]: time="2025-11-24T06:56:53.097656994Z" level=error msg="Failed to destroy network for sandbox \"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.108716 containerd[1621]: time="2025-11-24T06:56:53.103637861Z" level=error msg="Failed to destroy network for sandbox \"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.108716 containerd[1621]: time="2025-11-24T06:56:53.108070633Z" level=error msg="Failed to destroy network for sandbox \"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.106327 systemd[1]: run-netns-cni\x2d779456db\x2d02dc\x2d324d\x2d9edd\x2d4582ebbe8dee.mount: Deactivated successfully. Nov 24 06:56:53.109427 systemd[1]: run-netns-cni\x2d0c869a6c\x2db5ee\x2d3602\x2d70b9\x2d2e7b1ee0db0b.mount: Deactivated successfully. 
Nov 24 06:56:53.112885 containerd[1621]: time="2025-11-24T06:56:53.109605935Z" level=error msg="Failed to destroy network for sandbox \"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.115577 containerd[1621]: time="2025-11-24T06:56:53.110937006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-n55vh,Uid:77c9c1c2-ff92-4d21-b427-835b49d2e048,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.115825 systemd[1]: Created slice kubepods-besteffort-podec589a89_1333_4d00_aa6a_417830a62536.slice - libcontainer container kubepods-besteffort-podec589a89_1333_4d00_aa6a_417830a62536.slice. 
Nov 24 06:56:53.121698 containerd[1621]: time="2025-11-24T06:56:53.116523845Z" level=error msg="Failed to destroy network for sandbox \"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.121698 containerd[1621]: time="2025-11-24T06:56:53.116756726Z" level=error msg="Failed to destroy network for sandbox \"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.121698 containerd[1621]: time="2025-11-24T06:56:53.111942618Z" level=error msg="Failed to destroy network for sandbox \"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.121698 containerd[1621]: time="2025-11-24T06:56:53.119978059Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6757d5779b-2xfwn,Uid:8655a062-8ee8-4565-9af9-1c36ab263987,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.136030 kubelet[2932]: E1124 06:56:53.135811 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.136030 kubelet[2932]: E1124 06:56:53.135857 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" Nov 24 06:56:53.136030 kubelet[2932]: E1124 06:56:53.135879 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" Nov 24 06:56:53.136397 kubelet[2932]: E1124 06:56:53.135918 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9eef19a7eda609ae2b8fe23f3a055e18945c58092e9906479f7b6a8cc23f95e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:56:53.159767 containerd[1621]: time="2025-11-24T06:56:53.159726346Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79966f5546-bkgtb,Uid:a8c5d990-4028-4565-8488-4dcc003e63da,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.160475 containerd[1621]: time="2025-11-24T06:56:53.160461459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hsxdh,Uid:ec589a89-1333-4d00-aa6a-417830a62536,Namespace:calico-system,Attempt:0,}" Nov 24 06:56:53.160601 kubelet[2932]: E1124 06:56:53.160556 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.160743 kubelet[2932]: E1124 06:56:53.160610 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" Nov 24 06:56:53.160743 kubelet[2932]: E1124 06:56:53.160639 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" Nov 24 06:56:53.160743 kubelet[2932]: E1124 06:56:53.160679 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5eedade5796b0190a27a16903973e973feea76b403da529e142640f309113da0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:56:53.161031 kubelet[2932]: E1124 06:56:53.161012 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.161060 kubelet[2932]: E1124 06:56:53.161033 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-79966f5546-bkgtb" Nov 24 06:56:53.161060 kubelet[2932]: E1124 06:56:53.161045 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-79966f5546-bkgtb" Nov 24 06:56:53.161100 kubelet[2932]: E1124 06:56:53.161066 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-79966f5546-bkgtb_calico-system(a8c5d990-4028-4565-8488-4dcc003e63da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-79966f5546-bkgtb_calico-system(a8c5d990-4028-4565-8488-4dcc003e63da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd407c41ae0c825ae60aedde51d1ede3f577a90f65e6d1ca809932d7251c0551\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-79966f5546-bkgtb" podUID="a8c5d990-4028-4565-8488-4dcc003e63da" Nov 24 06:56:53.175845 containerd[1621]: time="2025-11-24T06:56:53.175736068Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gkfn4,Uid:7332a908-d485-404d-91ee-a472cfca4232,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Nov 24 06:56:53.176335 kubelet[2932]: E1124 06:56:53.176239 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.176335 kubelet[2932]: E1124 06:56:53.176270 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gkfn4" Nov 24 06:56:53.176335 kubelet[2932]: E1124 06:56:53.176289 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gkfn4" Nov 24 06:56:53.176437 kubelet[2932]: E1124 06:56:53.176344 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-gkfn4_kube-system(7332a908-d485-404d-91ee-a472cfca4232)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-gkfn4_kube-system(7332a908-d485-404d-91ee-a472cfca4232)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c17f369173b2481b565b4ba02688297b1c9730f11fc0a930866fed2a3e165228\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-gkfn4" podUID="7332a908-d485-404d-91ee-a472cfca4232" Nov 24 06:56:53.184862 containerd[1621]: time="2025-11-24T06:56:53.184759681Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79b4855f45-7htjs,Uid:e9872522-30cd-4303-b0f4-9d477ec17bc5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.185317 kubelet[2932]: E1124 06:56:53.185218 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.185317 kubelet[2932]: E1124 06:56:53.185265 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" Nov 24 06:56:53.185317 kubelet[2932]: E1124 06:56:53.185281 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" Nov 24 06:56:53.185418 kubelet[2932]: E1124 06:56:53.185310 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46d943683baab63567f337679ceda9224f8a59033784d248eace97876bc0e735\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:56:53.194240 containerd[1621]: time="2025-11-24T06:56:53.194141564Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-swsqx,Uid:3079118d-9876-4056-a671-92f88f5f8c3d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.194374 kubelet[2932]: E1124 06:56:53.194333 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.194407 kubelet[2932]: E1124 06:56:53.194385 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:53.194407 kubelet[2932]: E1124 06:56:53.194399 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-swsqx" Nov 24 06:56:53.194467 kubelet[2932]: E1124 06:56:53.194443 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e139b8cd425779706858c70ddce5a93c9a1e5c2b01b385a03b46cf72cfa523e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-swsqx" 
podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:56:53.200450 containerd[1621]: time="2025-11-24T06:56:53.200387164Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-csbwd,Uid:56a1253d-b0f7-4032-98a1-7eca8d8f6d62,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.200525 kubelet[2932]: E1124 06:56:53.200510 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.200557 kubelet[2932]: E1124 06:56:53.200539 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" Nov 24 06:56:53.200578 kubelet[2932]: E1124 06:56:53.200558 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" Nov 24 06:56:53.200599 kubelet[2932]: E1124 06:56:53.200586 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4cfea463e24d9524b610a278803ec96ef52d7c73e2ed17dd1ec91efd07b3dc9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:56:53.208446 containerd[1621]: time="2025-11-24T06:56:53.208349331Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5jnz6,Uid:c9b82770-9b92-4ba0-9d3c-7b0c3edc869c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.208685 kubelet[2932]: E1124 06:56:53.208651 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.209232 kubelet[2932]: E1124 
06:56:53.208755 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5jnz6" Nov 24 06:56:53.209232 kubelet[2932]: E1124 06:56:53.209169 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5jnz6" Nov 24 06:56:53.209232 kubelet[2932]: E1124 06:56:53.209201 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5jnz6_kube-system(c9b82770-9b92-4ba0-9d3c-7b0c3edc869c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5jnz6_kube-system(c9b82770-9b92-4ba0-9d3c-7b0c3edc869c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe63cd797dbe24b86ccc2718e766ddd7bc1531dedb621a8170d7f1af0b860dee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5jnz6" podUID="c9b82770-9b92-4ba0-9d3c-7b0c3edc869c" Nov 24 06:56:53.283700 containerd[1621]: time="2025-11-24T06:56:53.283459095Z" level=error msg="Failed to destroy network for sandbox \"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.294567 containerd[1621]: time="2025-11-24T06:56:53.294500848Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hsxdh,Uid:ec589a89-1333-4d00-aa6a-417830a62536,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.294849 kubelet[2932]: E1124 06:56:53.294826 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Nov 24 06:56:53.294996 kubelet[2932]: E1124 06:56:53.294940 2932 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:53.294996 kubelet[2932]: E1124 06:56:53.294961 2932 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hsxdh" Nov 24 06:56:53.295110 kubelet[2932]: E1124 06:56:53.295091 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"774eefc59d2890ad8b960ca77a69aec0ed9b13edb9fedaa37c5bb4e9adc79edb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:56:53.917110 systemd[1]: run-netns-cni\x2db5a8384f\x2d5bfd\x2d3ff6\x2d47cb\x2d86fb82390cd6.mount: Deactivated successfully. Nov 24 06:56:53.917222 systemd[1]: run-netns-cni\x2d0707582f\x2db24c\x2d3203\x2dafd0\x2d38b3ab8c5c97.mount: Deactivated successfully. Nov 24 06:56:53.917266 systemd[1]: run-netns-cni\x2d739e5a45\x2d144e\x2db94a\x2df55e\x2da6cb417ce8ad.mount: Deactivated successfully. Nov 24 06:56:53.917343 systemd[1]: run-netns-cni\x2d4f020db7\x2d6973\x2d547d\x2d2274\x2db444103b8dda.mount: Deactivated successfully. Nov 24 06:56:57.455653 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2683058519.mount: Deactivated successfully. 
Nov 24 06:56:57.564204 containerd[1621]: time="2025-11-24T06:56:57.557018437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:57.596948 containerd[1621]: time="2025-11-24T06:56:57.596907707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Nov 24 06:56:57.614395 containerd[1621]: time="2025-11-24T06:56:57.614334920Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:57.640996 containerd[1621]: time="2025-11-24T06:56:57.640943380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Nov 24 06:56:57.643361 containerd[1621]: time="2025-11-24T06:56:57.643288450Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.356875625s" Nov 24 06:56:57.643361 containerd[1621]: time="2025-11-24T06:56:57.643312535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Nov 24 06:56:57.803953 containerd[1621]: time="2025-11-24T06:56:57.803912746Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Nov 24 06:56:58.850096 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1545609100.mount: 
Deactivated successfully. Nov 24 06:56:58.851097 containerd[1621]: time="2025-11-24T06:56:58.850667073Z" level=info msg="Container f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:56:58.978384 containerd[1621]: time="2025-11-24T06:56:58.978275538Z" level=info msg="CreateContainer within sandbox \"0b2293ea10ff70440ba1ed0f8b96e8df8f57d910b8dbcf1eaf70574e14b3975a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73\"" Nov 24 06:56:58.978837 containerd[1621]: time="2025-11-24T06:56:58.978809759Z" level=info msg="StartContainer for \"f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73\"" Nov 24 06:56:58.993698 containerd[1621]: time="2025-11-24T06:56:58.993614408Z" level=info msg="connecting to shim f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73" address="unix:///run/containerd/s/1c986947fb8b3e68d47495aa38e566039470b9d2f78d5b20f764c4e1208381f5" protocol=ttrpc version=3 Nov 24 06:56:59.237785 systemd[1]: Started cri-containerd-f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73.scope - libcontainer container f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73. Nov 24 06:56:59.352601 containerd[1621]: time="2025-11-24T06:56:59.352579924Z" level=info msg="StartContainer for \"f850d9bdde32eb3638354713f6f9b4d4dae3916292687f31267b0bcbd1ab4f73\" returns successfully" Nov 24 06:57:00.266664 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Nov 24 06:57:00.268951 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Nov 24 06:57:00.583655 kubelet[2932]: I1124 06:57:00.576205 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wlp6h" podStartSLOduration=4.194937051 podStartE2EDuration="20.576184305s" podCreationTimestamp="2025-11-24 06:56:40 +0000 UTC" firstStartedPulling="2025-11-24 06:56:41.262542499 +0000 UTC m=+20.277614298" lastFinishedPulling="2025-11-24 06:56:57.643789749 +0000 UTC m=+36.658861552" observedRunningTime="2025-11-24 06:57:00.481164642 +0000 UTC m=+39.496236464" watchObservedRunningTime="2025-11-24 06:57:00.576184305 +0000 UTC m=+39.591256112" Nov 24 06:57:01.086572 kubelet[2932]: I1124 06:57:01.086175 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-backend-key-pair\") pod \"a8c5d990-4028-4565-8488-4dcc003e63da\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " Nov 24 06:57:01.086572 kubelet[2932]: I1124 06:57:01.086236 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-ca-bundle\") pod \"a8c5d990-4028-4565-8488-4dcc003e63da\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " Nov 24 06:57:01.086572 kubelet[2932]: I1124 06:57:01.086279 2932 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kccq\" (UniqueName: \"kubernetes.io/projected/a8c5d990-4028-4565-8488-4dcc003e63da-kube-api-access-8kccq\") pod \"a8c5d990-4028-4565-8488-4dcc003e63da\" (UID: \"a8c5d990-4028-4565-8488-4dcc003e63da\") " Nov 24 06:57:01.099150 kubelet[2932]: I1124 06:57:01.099126 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"a8c5d990-4028-4565-8488-4dcc003e63da" (UID: "a8c5d990-4028-4565-8488-4dcc003e63da"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Nov 24 06:57:01.111486 kubelet[2932]: I1124 06:57:01.111455 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c5d990-4028-4565-8488-4dcc003e63da-kube-api-access-8kccq" (OuterVolumeSpecName: "kube-api-access-8kccq") pod "a8c5d990-4028-4565-8488-4dcc003e63da" (UID: "a8c5d990-4028-4565-8488-4dcc003e63da"). InnerVolumeSpecName "kube-api-access-8kccq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Nov 24 06:57:01.111829 systemd[1]: var-lib-kubelet-pods-a8c5d990\x2d4028\x2d4565\x2d8488\x2d4dcc003e63da-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8kccq.mount: Deactivated successfully. Nov 24 06:57:01.115633 kubelet[2932]: I1124 06:57:01.115532 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a8c5d990-4028-4565-8488-4dcc003e63da" (UID: "a8c5d990-4028-4565-8488-4dcc003e63da"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Nov 24 06:57:01.117270 systemd[1]: var-lib-kubelet-pods-a8c5d990\x2d4028\x2d4565\x2d8488\x2d4dcc003e63da-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Nov 24 06:57:01.129321 systemd[1]: Removed slice kubepods-besteffort-poda8c5d990_4028_4565_8488_4dcc003e63da.slice - libcontainer container kubepods-besteffort-poda8c5d990_4028_4565_8488_4dcc003e63da.slice. 
Nov 24 06:57:01.187501 kubelet[2932]: I1124 06:57:01.187436 2932 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kccq\" (UniqueName: \"kubernetes.io/projected/a8c5d990-4028-4565-8488-4dcc003e63da-kube-api-access-8kccq\") on node \"localhost\" DevicePath \"\"" Nov 24 06:57:01.187501 kubelet[2932]: I1124 06:57:01.187470 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Nov 24 06:57:01.187501 kubelet[2932]: I1124 06:57:01.187480 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8c5d990-4028-4565-8488-4dcc003e63da-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Nov 24 06:57:01.312396 kubelet[2932]: I1124 06:57:01.312275 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 06:57:01.571208 systemd[1]: Created slice kubepods-besteffort-podd262c589_dcbf_4568_b396_12186ab1a67f.slice - libcontainer container kubepods-besteffort-podd262c589_dcbf_4568_b396_12186ab1a67f.slice. 
Nov 24 06:57:01.588961 kubelet[2932]: I1124 06:57:01.588927 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x6zs\" (UniqueName: \"kubernetes.io/projected/d262c589-dcbf-4568-b396-12186ab1a67f-kube-api-access-8x6zs\") pod \"whisker-7d558dc64d-9gf9z\" (UID: \"d262c589-dcbf-4568-b396-12186ab1a67f\") " pod="calico-system/whisker-7d558dc64d-9gf9z" Nov 24 06:57:01.588961 kubelet[2932]: I1124 06:57:01.588968 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d262c589-dcbf-4568-b396-12186ab1a67f-whisker-backend-key-pair\") pod \"whisker-7d558dc64d-9gf9z\" (UID: \"d262c589-dcbf-4568-b396-12186ab1a67f\") " pod="calico-system/whisker-7d558dc64d-9gf9z" Nov 24 06:57:01.589451 kubelet[2932]: I1124 06:57:01.588985 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d262c589-dcbf-4568-b396-12186ab1a67f-whisker-ca-bundle\") pod \"whisker-7d558dc64d-9gf9z\" (UID: \"d262c589-dcbf-4568-b396-12186ab1a67f\") " pod="calico-system/whisker-7d558dc64d-9gf9z" Nov 24 06:57:01.881872 containerd[1621]: time="2025-11-24T06:57:01.881486442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d558dc64d-9gf9z,Uid:d262c589-dcbf-4568-b396-12186ab1a67f,Namespace:calico-system,Attempt:0,}" Nov 24 06:57:02.341453 systemd-networkd[1502]: cali07f44e3ff4a: Link UP Nov 24 06:57:02.341944 systemd-networkd[1502]: cali07f44e3ff4a: Gained carrier Nov 24 06:57:02.354331 containerd[1621]: 2025-11-24 06:57:02.009 [INFO][4157] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Nov 24 06:57:02.354331 containerd[1621]: 2025-11-24 06:57:02.049 [INFO][4157] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-whisker--7d558dc64d--9gf9z-eth0 whisker-7d558dc64d- calico-system d262c589-dcbf-4568-b396-12186ab1a67f 905 0 2025-11-24 06:57:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7d558dc64d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7d558dc64d-9gf9z eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali07f44e3ff4a [] [] }} ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-" Nov 24 06:57:02.354331 containerd[1621]: 2025-11-24 06:57:02.049 [INFO][4157] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.354331 containerd[1621]: 2025-11-24 06:57:02.278 [INFO][4170] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" HandleID="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Workload="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.288 [INFO][4170] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" HandleID="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Workload="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00027e2e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7d558dc64d-9gf9z", "timestamp":"2025-11-24 06:57:02.278785665 +0000 
UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.288 [INFO][4170] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.288 [INFO][4170] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.289 [INFO][4170] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.302 [INFO][4170] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" host="localhost" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.314 [INFO][4170] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.318 [INFO][4170] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.319 [INFO][4170] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.320 [INFO][4170] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:02.354557 containerd[1621]: 2025-11-24 06:57:02.320 [INFO][4170] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" host="localhost" Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.321 [INFO][4170] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24 Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.323 [INFO][4170] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" host="localhost" Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.326 [INFO][4170] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" host="localhost" Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.326 [INFO][4170] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" host="localhost" Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.326 [INFO][4170] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:57:02.355262 containerd[1621]: 2025-11-24 06:57:02.326 [INFO][4170] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" HandleID="k8s-pod-network.11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Workload="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.355727 containerd[1621]: 2025-11-24 06:57:02.327 [INFO][4157] cni-plugin/k8s.go 418: Populated endpoint ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7d558dc64d--9gf9z-eth0", GenerateName:"whisker-7d558dc64d-", Namespace:"calico-system", SelfLink:"", UID:"d262c589-dcbf-4568-b396-12186ab1a67f", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 57, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d558dc64d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7d558dc64d-9gf9z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali07f44e3ff4a", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:02.355727 containerd[1621]: 2025-11-24 06:57:02.328 [INFO][4157] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.355826 containerd[1621]: 2025-11-24 06:57:02.328 [INFO][4157] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07f44e3ff4a ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.355826 containerd[1621]: 2025-11-24 06:57:02.342 [INFO][4157] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.356579 containerd[1621]: 2025-11-24 06:57:02.343 [INFO][4157] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7d558dc64d--9gf9z-eth0", GenerateName:"whisker-7d558dc64d-", Namespace:"calico-system", SelfLink:"", UID:"d262c589-dcbf-4568-b396-12186ab1a67f", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 57, 1, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d558dc64d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24", Pod:"whisker-7d558dc64d-9gf9z", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali07f44e3ff4a", MAC:"c6:04:59:c7:44:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:02.356724 containerd[1621]: 2025-11-24 06:57:02.350 [INFO][4157] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" Namespace="calico-system" Pod="whisker-7d558dc64d-9gf9z" WorkloadEndpoint="localhost-k8s-whisker--7d558dc64d--9gf9z-eth0" Nov 24 06:57:02.507232 containerd[1621]: time="2025-11-24T06:57:02.506990003Z" level=info msg="connecting to shim 11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24" address="unix:///run/containerd/s/1c5457842b02abd3c27c25d190b2dbefd6c2ad196b3282b83eb53c0ef09ee294" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:02.527736 systemd[1]: Started cri-containerd-11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24.scope - libcontainer container 11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24. 
Nov 24 06:57:02.537141 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:02.571153 containerd[1621]: time="2025-11-24T06:57:02.571088531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d558dc64d-9gf9z,Uid:d262c589-dcbf-4568-b396-12186ab1a67f,Namespace:calico-system,Attempt:0,} returns sandbox id \"11b0ee5c8512c61429c1434178d91f4bd12bab75fd683eacfd66435370809b24\"" Nov 24 06:57:02.589861 containerd[1621]: time="2025-11-24T06:57:02.589808943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:57:03.001285 containerd[1621]: time="2025-11-24T06:57:03.001235128Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:03.001641 containerd[1621]: time="2025-11-24T06:57:03.001605477Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:57:03.001702 containerd[1621]: time="2025-11-24T06:57:03.001679484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:57:03.007351 kubelet[2932]: E1124 06:57:03.004336 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:03.009706 kubelet[2932]: E1124 06:57:03.009644 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:03.012576 kubelet[2932]: E1124 06:57:03.012432 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f83afd19fc7845bbac116113552ede1f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:03.014285 containerd[1621]: time="2025-11-24T06:57:03.014046934Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:57:03.109290 kubelet[2932]: I1124 06:57:03.109265 2932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c5d990-4028-4565-8488-4dcc003e63da" path="/var/lib/kubelet/pods/a8c5d990-4028-4565-8488-4dcc003e63da/volumes" Nov 24 06:57:03.351448 containerd[1621]: time="2025-11-24T06:57:03.351419093Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:03.360007 containerd[1621]: time="2025-11-24T06:57:03.359981193Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:57:03.360096 containerd[1621]: time="2025-11-24T06:57:03.360029253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:03.360191 kubelet[2932]: E1124 06:57:03.360148 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:03.360232 kubelet[2932]: E1124 06:57:03.360199 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:03.360477 kubelet[2932]: E1124 06:57:03.360291 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevice
s:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:03.361440 kubelet[2932]: E1124 06:57:03.361414 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:03.646915 systemd-networkd[1502]: cali07f44e3ff4a: Gained IPv6LL Nov 24 06:57:04.104824 containerd[1621]: time="2025-11-24T06:57:04.104497215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-csbwd,Uid:56a1253d-b0f7-4032-98a1-7eca8d8f6d62,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:57:04.191050 systemd-networkd[1502]: calic65b2d6e423: Link UP Nov 24 06:57:04.191921 systemd-networkd[1502]: calic65b2d6e423: Gained carrier Nov 24 06:57:04.202572 containerd[1621]: 2025-11-24 06:57:04.129 [INFO][4253] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Nov 24 06:57:04.202572 
containerd[1621]: 2025-11-24 06:57:04.137 [INFO][4253] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0 calico-apiserver-7c65846d8b- calico-apiserver 56a1253d-b0f7-4032-98a1-7eca8d8f6d62 833 0 2025-11-24 06:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c65846d8b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7c65846d8b-csbwd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic65b2d6e423 [] [] }} ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-" Nov 24 06:57:04.202572 containerd[1621]: 2025-11-24 06:57:04.138 [INFO][4253] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.202572 containerd[1621]: 2025-11-24 06:57:04.166 [INFO][4265] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" HandleID="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Workload="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.166 [INFO][4265] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" 
HandleID="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Workload="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024efe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7c65846d8b-csbwd", "timestamp":"2025-11-24 06:57:04.16688094 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.167 [INFO][4265] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.167 [INFO][4265] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.167 [INFO][4265] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.171 [INFO][4265] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" host="localhost" Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.173 [INFO][4265] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.175 [INFO][4265] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.176 [INFO][4265] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:04.202816 containerd[1621]: 2025-11-24 06:57:04.178 [INFO][4265] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:04.202816 containerd[1621]: 
2025-11-24 06:57:04.178 [INFO][4265] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" host="localhost" Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.179 [INFO][4265] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10 Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.181 [INFO][4265] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" host="localhost" Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.185 [INFO][4265] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" host="localhost" Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.185 [INFO][4265] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" host="localhost" Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.185 [INFO][4265] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:57:04.205148 containerd[1621]: 2025-11-24 06:57:04.185 [INFO][4265] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" HandleID="k8s-pod-network.25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Workload="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.205306 containerd[1621]: 2025-11-24 06:57:04.187 [INFO][4253] cni-plugin/k8s.go 418: Populated endpoint ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0", GenerateName:"calico-apiserver-7c65846d8b-", Namespace:"calico-apiserver", SelfLink:"", UID:"56a1253d-b0f7-4032-98a1-7eca8d8f6d62", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c65846d8b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7c65846d8b-csbwd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic65b2d6e423", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:04.205370 containerd[1621]: 2025-11-24 06:57:04.187 [INFO][4253] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.205370 containerd[1621]: 2025-11-24 06:57:04.187 [INFO][4253] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic65b2d6e423 ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.205370 containerd[1621]: 2025-11-24 06:57:04.190 [INFO][4253] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.205447 containerd[1621]: 2025-11-24 06:57:04.190 [INFO][4253] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0", 
GenerateName:"calico-apiserver-7c65846d8b-", Namespace:"calico-apiserver", SelfLink:"", UID:"56a1253d-b0f7-4032-98a1-7eca8d8f6d62", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c65846d8b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10", Pod:"calico-apiserver-7c65846d8b-csbwd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic65b2d6e423", MAC:"e2:53:f9:d1:31:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:04.205490 containerd[1621]: 2025-11-24 06:57:04.197 [INFO][4253] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-csbwd" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--csbwd-eth0" Nov 24 06:57:04.227036 containerd[1621]: time="2025-11-24T06:57:04.227002424Z" level=info msg="connecting to shim 25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10" 
address="unix:///run/containerd/s/0b1bdb88b54de2d3a4ece547b252199f1c8f20cc750e51fb126964bc4879ef3d" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:04.262165 systemd[1]: Started cri-containerd-25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10.scope - libcontainer container 25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10. Nov 24 06:57:04.277406 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:04.323072 containerd[1621]: time="2025-11-24T06:57:04.323040888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-csbwd,Uid:56a1253d-b0f7-4032-98a1-7eca8d8f6d62,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"25357f2feb8a8c3d53eb44f8c47bf53718a7b50356ad478420e2d733a2f33b10\"" Nov 24 06:57:04.323193 kubelet[2932]: E1124 06:57:04.323081 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:04.326755 containerd[1621]: time="2025-11-24T06:57:04.326730590Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:04.662744 containerd[1621]: time="2025-11-24T06:57:04.662700874Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:04.665099 containerd[1621]: time="2025-11-24T06:57:04.665060861Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:04.665099 containerd[1621]: time="2025-11-24T06:57:04.665081221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:04.665235 kubelet[2932]: E1124 06:57:04.665197 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:04.665325 kubelet[2932]: E1124 06:57:04.665233 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:04.665360 kubelet[2932]: E1124 06:57:04.665333 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kszgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:04.666596 kubelet[2932]: E1124 06:57:04.666564 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:04.727859 kubelet[2932]: I1124 06:57:04.727752 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 06:57:05.105504 containerd[1621]: time="2025-11-24T06:57:05.105271066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hsxdh,Uid:ec589a89-1333-4d00-aa6a-417830a62536,Namespace:calico-system,Attempt:0,}" Nov 24 06:57:05.238051 systemd-networkd[1502]: calib6f58e9a99f: Link UP Nov 24 06:57:05.239448 systemd-networkd[1502]: calib6f58e9a99f: Gained carrier Nov 24 06:57:05.266715 containerd[1621]: 2025-11-24 06:57:05.146 [INFO][4350] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Nov 24 06:57:05.266715 containerd[1621]: 2025-11-24 06:57:05.155 [INFO][4350] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--hsxdh-eth0 csi-node-driver- calico-system ec589a89-1333-4d00-aa6a-417830a62536 723 0 2025-11-24 06:56:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-hsxdh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib6f58e9a99f [] [] }} ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-" Nov 24 06:57:05.266715 containerd[1621]: 2025-11-24 06:57:05.155 [INFO][4350] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.266715 containerd[1621]: 2025-11-24 06:57:05.189 [INFO][4361] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" HandleID="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Workload="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.189 [INFO][4361] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" HandleID="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Workload="localhost-k8s-csi--node--driver--hsxdh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-hsxdh", "timestamp":"2025-11-24 06:57:05.189404354 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:05.266907 containerd[1621]: 
2025-11-24 06:57:05.189 [INFO][4361] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.189 [INFO][4361] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.189 [INFO][4361] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.197 [INFO][4361] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" host="localhost" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.200 [INFO][4361] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.205 [INFO][4361] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.208 [INFO][4361] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.210 [INFO][4361] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:05.266907 containerd[1621]: 2025-11-24 06:57:05.210 [INFO][4361] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" host="localhost" Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.211 [INFO][4361] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580 Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.226 [INFO][4361] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" 
host="localhost" Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.232 [INFO][4361] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" host="localhost" Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.232 [INFO][4361] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" host="localhost" Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.232 [INFO][4361] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:05.267281 containerd[1621]: 2025-11-24 06:57:05.232 [INFO][4361] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" HandleID="k8s-pod-network.9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Workload="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.267394 containerd[1621]: 2025-11-24 06:57:05.234 [INFO][4350] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--hsxdh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ec589a89-1333-4d00-aa6a-417830a62536", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-hsxdh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib6f58e9a99f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:05.267453 containerd[1621]: 2025-11-24 06:57:05.234 [INFO][4350] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.267453 containerd[1621]: 2025-11-24 06:57:05.235 [INFO][4350] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib6f58e9a99f ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.267453 containerd[1621]: 2025-11-24 06:57:05.240 [INFO][4350] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.267516 containerd[1621]: 2025-11-24 06:57:05.241 
[INFO][4350] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--hsxdh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ec589a89-1333-4d00-aa6a-417830a62536", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580", Pod:"csi-node-driver-hsxdh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib6f58e9a99f", MAC:"ca:d9:4b:d2:59:92", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:05.267564 containerd[1621]: 2025-11-24 06:57:05.263 [INFO][4350] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" Namespace="calico-system" Pod="csi-node-driver-hsxdh" WorkloadEndpoint="localhost-k8s-csi--node--driver--hsxdh-eth0" Nov 24 06:57:05.288339 containerd[1621]: time="2025-11-24T06:57:05.287996481Z" level=info msg="connecting to shim 9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580" address="unix:///run/containerd/s/80f56eb6c2698a51e5d222de70acb80fc6ff60eea080f5e18241f5ee94dda14a" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:05.319200 systemd[1]: Started cri-containerd-9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580.scope - libcontainer container 9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580. Nov 24 06:57:05.323565 kubelet[2932]: E1124 06:57:05.323512 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:05.340490 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:05.362039 containerd[1621]: time="2025-11-24T06:57:05.361956801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hsxdh,Uid:ec589a89-1333-4d00-aa6a-417830a62536,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ca0d771e34a6b285f657c5a2ce27fdb9a8fe3bc646bca9ad1a51cd6e2e76580\"" Nov 24 06:57:05.364741 containerd[1621]: time="2025-11-24T06:57:05.364717290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 
06:57:05.436868 systemd-networkd[1502]: calic65b2d6e423: Gained IPv6LL Nov 24 06:57:05.702566 systemd-networkd[1502]: vxlan.calico: Link UP Nov 24 06:57:05.702571 systemd-networkd[1502]: vxlan.calico: Gained carrier Nov 24 06:57:05.711469 containerd[1621]: time="2025-11-24T06:57:05.711415098Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:05.713141 containerd[1621]: time="2025-11-24T06:57:05.711884718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:57:05.713141 containerd[1621]: time="2025-11-24T06:57:05.711955139Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:57:05.713237 kubelet[2932]: E1124 06:57:05.712075 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:05.713237 kubelet[2932]: E1124 06:57:05.712109 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:05.719597 kubelet[2932]: E1124 06:57:05.719543 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:05.723288 containerd[1621]: time="2025-11-24T06:57:05.723267940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:57:05.898451 kubelet[2932]: I1124 06:57:05.898417 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 24 06:57:06.168643 containerd[1621]: time="2025-11-24T06:57:06.168553054Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:06.177895 containerd[1621]: time="2025-11-24T06:57:06.177766401Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:57:06.177895 containerd[1621]: time="2025-11-24T06:57:06.177826032Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:57:06.178264 kubelet[2932]: E1124 06:57:06.178079 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:06.178264 kubelet[2932]: E1124 06:57:06.178134 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:06.178471 kubelet[2932]: E1124 06:57:06.178393 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:06.179628 kubelet[2932]: E1124 06:57:06.179550 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:57:06.325447 kubelet[2932]: E1124 06:57:06.325199 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" 
podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:06.325960 kubelet[2932]: E1124 06:57:06.325850 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:57:06.972781 systemd-networkd[1502]: vxlan.calico: Gained IPv6LL Nov 24 06:57:07.105592 containerd[1621]: time="2025-11-24T06:57:07.105484035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5jnz6,Uid:c9b82770-9b92-4ba0-9d3c-7b0c3edc869c,Namespace:kube-system,Attempt:0,}" Nov 24 06:57:07.105989 containerd[1621]: time="2025-11-24T06:57:07.105918963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-n55vh,Uid:77c9c1c2-ff92-4d21-b427-835b49d2e048,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:57:07.105989 containerd[1621]: time="2025-11-24T06:57:07.105968237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6757d5779b-2xfwn,Uid:8655a062-8ee8-4565-9af9-1c36ab263987,Namespace:calico-apiserver,Attempt:0,}" Nov 24 06:57:07.106091 containerd[1621]: 
time="2025-11-24T06:57:07.106063031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gkfn4,Uid:7332a908-d485-404d-91ee-a472cfca4232,Namespace:kube-system,Attempt:0,}" Nov 24 06:57:07.106269 containerd[1621]: time="2025-11-24T06:57:07.106258686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-swsqx,Uid:3079118d-9876-4056-a671-92f88f5f8c3d,Namespace:calico-system,Attempt:0,}" Nov 24 06:57:07.166282 systemd-networkd[1502]: calib6f58e9a99f: Gained IPv6LL Nov 24 06:57:07.364857 systemd-networkd[1502]: cali2b3b243f52d: Link UP Nov 24 06:57:07.365649 systemd-networkd[1502]: cali2b3b243f52d: Gained carrier Nov 24 06:57:07.384255 containerd[1621]: 2025-11-24 06:57:07.203 [INFO][4618] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0 coredns-674b8bbfcf- kube-system c9b82770-9b92-4ba0-9d3c-7b0c3edc869c 834 0 2025-11-24 06:56:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-5jnz6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2b3b243f52d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-" Nov 24 06:57:07.384255 containerd[1621]: 2025-11-24 06:57:07.203 [INFO][4618] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.384255 containerd[1621]: 2025-11-24 06:57:07.306 [INFO][4672] ipam/ipam_plugin.go 
227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" HandleID="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Workload="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.306 [INFO][4672] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" HandleID="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Workload="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000329620), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-5jnz6", "timestamp":"2025-11-24 06:57:07.306718175 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.307 [INFO][4672] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.307 [INFO][4672] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.307 [INFO][4672] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.319 [INFO][4672] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" host="localhost" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.324 [INFO][4672] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.327 [INFO][4672] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.329 [INFO][4672] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.347 [INFO][4672] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.384692 containerd[1621]: 2025-11-24 06:57:07.347 [INFO][4672] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" host="localhost" Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.348 [INFO][4672] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484 Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.352 [INFO][4672] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" host="localhost" Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4672] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" host="localhost" Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4672] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" host="localhost" Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4672] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:07.384875 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4672] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" HandleID="k8s-pod-network.ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Workload="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.385764 containerd[1621]: 2025-11-24 06:57:07.360 [INFO][4618] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c9b82770-9b92-4ba0-9d3c-7b0c3edc869c", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-5jnz6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b3b243f52d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.385827 containerd[1621]: 2025-11-24 06:57:07.360 [INFO][4618] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.385827 containerd[1621]: 2025-11-24 06:57:07.360 [INFO][4618] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b3b243f52d ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.385827 containerd[1621]: 2025-11-24 06:57:07.367 [INFO][4618] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.385920 containerd[1621]: 2025-11-24 06:57:07.368 [INFO][4618] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c9b82770-9b92-4ba0-9d3c-7b0c3edc869c", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484", Pod:"coredns-674b8bbfcf-5jnz6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b3b243f52d", MAC:"c6:ae:2a:45:dd:0c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.385920 containerd[1621]: 2025-11-24 06:57:07.377 [INFO][4618] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" Namespace="kube-system" Pod="coredns-674b8bbfcf-5jnz6" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5jnz6-eth0" Nov 24 06:57:07.419109 containerd[1621]: time="2025-11-24T06:57:07.419012642Z" level=info msg="connecting to shim ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484" address="unix:///run/containerd/s/727ac6889cde8ac91e77943e58ca901acc462b51285b6eab32c51457a6623d84" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:07.442099 systemd[1]: Started cri-containerd-ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484.scope - libcontainer container ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484. 
Nov 24 06:57:07.459938 kubelet[2932]: E1124 06:57:07.459898 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:57:07.475689 systemd-networkd[1502]: cali8b984ee59b6: Link UP Nov 24 06:57:07.480781 systemd-networkd[1502]: cali8b984ee59b6: Gained carrier Nov 24 06:57:07.505462 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.190 [INFO][4611] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0 coredns-674b8bbfcf- kube-system 7332a908-d485-404d-91ee-a472cfca4232 825 0 2025-11-24 06:56:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-gkfn4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8b984ee59b6 
[{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.191 [INFO][4611] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.310 [INFO][4670] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" HandleID="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Workload="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.310 [INFO][4670] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" HandleID="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Workload="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f40), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-gkfn4", "timestamp":"2025-11-24 06:57:07.31043713 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.310 [INFO][4670] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4670] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.356 [INFO][4670] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.421 [INFO][4670] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.434 [INFO][4670] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.440 [INFO][4670] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.441 [INFO][4670] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.445 [INFO][4670] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.445 [INFO][4670] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.446 [INFO][4670] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471 Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.451 [INFO][4670] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.461 [INFO][4670] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.462 [INFO][4670] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" host="localhost" Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.462 [INFO][4670] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:07.508901 containerd[1621]: 2025-11-24 06:57:07.462 [INFO][4670] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" HandleID="k8s-pod-network.67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Workload="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.467 [INFO][4611] cni-plugin/k8s.go 418: Populated endpoint ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7332a908-d485-404d-91ee-a472cfca4232", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-gkfn4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8b984ee59b6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.467 [INFO][4611] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.467 [INFO][4611] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b984ee59b6 ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.484 [INFO][4611] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.488 [INFO][4611] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7332a908-d485-404d-91ee-a472cfca4232", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471", Pod:"coredns-674b8bbfcf-gkfn4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8b984ee59b6", MAC:"aa:89:27:63:6d:72", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.511957 containerd[1621]: 2025-11-24 06:57:07.505 [INFO][4611] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" Namespace="kube-system" Pod="coredns-674b8bbfcf-gkfn4" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--gkfn4-eth0" Nov 24 06:57:07.555257 containerd[1621]: time="2025-11-24T06:57:07.555226220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5jnz6,Uid:c9b82770-9b92-4ba0-9d3c-7b0c3edc869c,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484\"" Nov 24 06:57:07.563426 containerd[1621]: time="2025-11-24T06:57:07.563251329Z" level=info msg="connecting to shim 67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471" address="unix:///run/containerd/s/1022a469a6b6ef95324f9e4364e8afe04aac5ae3c186784018d6771fc1c3b3a7" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:07.578710 containerd[1621]: time="2025-11-24T06:57:07.578684649Z" level=info msg="CreateContainer within sandbox \"ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Nov 24 06:57:07.604037 systemd-networkd[1502]: cali710e0cb15cb: Link UP Nov 24 06:57:07.605325 systemd-networkd[1502]: cali710e0cb15cb: Gained carrier Nov 24 06:57:07.607734 containerd[1621]: time="2025-11-24T06:57:07.607605218Z" level=info msg="Container 1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81: CDI devices from CRI 
Config.CDIDevices: []" Nov 24 06:57:07.613568 containerd[1621]: time="2025-11-24T06:57:07.613526505Z" level=info msg="CreateContainer within sandbox \"ed4778ed49242cdfe958e0faa4a2944d7fd50aa6f27d4fd3215aa3109cadd484\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81\"" Nov 24 06:57:07.616330 containerd[1621]: time="2025-11-24T06:57:07.614990037Z" level=info msg="StartContainer for \"1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81\"" Nov 24 06:57:07.623079 containerd[1621]: time="2025-11-24T06:57:07.623003884Z" level=info msg="connecting to shim 1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81" address="unix:///run/containerd/s/727ac6889cde8ac91e77943e58ca901acc462b51285b6eab32c51457a6623d84" protocol=ttrpc version=3 Nov 24 06:57:07.636911 systemd[1]: Started cri-containerd-67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471.scope - libcontainer container 67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471. 
Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.212 [INFO][4628] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0 calico-apiserver-7c65846d8b- calico-apiserver 77c9c1c2-ff92-4d21-b427-835b49d2e048 837 0 2025-11-24 06:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c65846d8b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7c65846d8b-n55vh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali710e0cb15cb [] [] }} ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.214 [INFO][4628] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.312 [INFO][4678] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" HandleID="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Workload="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.312 [INFO][4678] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" 
HandleID="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Workload="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf820), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7c65846d8b-n55vh", "timestamp":"2025-11-24 06:57:07.31254108 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.312 [INFO][4678] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.462 [INFO][4678] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.462 [INFO][4678] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.523 [INFO][4678] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.540 [INFO][4678] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.547 [INFO][4678] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.551 [INFO][4678] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.555 [INFO][4678] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 
2025-11-24 06:57:07.555 [INFO][4678] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.557 [INFO][4678] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774 Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.564 [INFO][4678] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.584 [INFO][4678] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.584 [INFO][4678] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" host="localhost" Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.589 [INFO][4678] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Nov 24 06:57:07.653067 containerd[1621]: 2025-11-24 06:57:07.589 [INFO][4678] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" HandleID="k8s-pod-network.deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Workload="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.599 [INFO][4628] cni-plugin/k8s.go 418: Populated endpoint ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0", GenerateName:"calico-apiserver-7c65846d8b-", Namespace:"calico-apiserver", SelfLink:"", UID:"77c9c1c2-ff92-4d21-b427-835b49d2e048", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c65846d8b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7c65846d8b-n55vh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali710e0cb15cb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.599 [INFO][4628] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.599 [INFO][4628] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali710e0cb15cb ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.606 [INFO][4628] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.609 [INFO][4628] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0", 
GenerateName:"calico-apiserver-7c65846d8b-", Namespace:"calico-apiserver", SelfLink:"", UID:"77c9c1c2-ff92-4d21-b427-835b49d2e048", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c65846d8b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774", Pod:"calico-apiserver-7c65846d8b-n55vh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali710e0cb15cb", MAC:"c6:f1:32:05:f0:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.658548 containerd[1621]: 2025-11-24 06:57:07.647 [INFO][4628] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" Namespace="calico-apiserver" Pod="calico-apiserver-7c65846d8b-n55vh" WorkloadEndpoint="localhost-k8s-calico--apiserver--7c65846d8b--n55vh-eth0" Nov 24 06:57:07.659889 systemd[1]: Started cri-containerd-1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81.scope - libcontainer container 1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81. 
Nov 24 06:57:07.687639 containerd[1621]: time="2025-11-24T06:57:07.687578701Z" level=info msg="connecting to shim deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774" address="unix:///run/containerd/s/174f56669d3fe9213e40640e8256e4c8331669d7abee5a84910d17eaf36827e6" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:07.691939 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:07.703769 systemd-networkd[1502]: cali50646ab4607: Link UP Nov 24 06:57:07.705198 systemd-networkd[1502]: cali50646ab4607: Gained carrier Nov 24 06:57:07.732847 systemd[1]: Started cri-containerd-deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774.scope - libcontainer container deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774. Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.224 [INFO][4652] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--666569f655--swsqx-eth0 goldmane-666569f655- calico-system 3079118d-9876-4056-a671-92f88f5f8c3d 835 0 2025-11-24 06:56:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-666569f655-swsqx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali50646ab4607 [] [] }} ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.224 [INFO][4652] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" 
WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.326 [INFO][4675] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" HandleID="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Workload="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.327 [INFO][4675] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" HandleID="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Workload="localhost-k8s-goldmane--666569f655--swsqx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002bcfe0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-666569f655-swsqx", "timestamp":"2025-11-24 06:57:07.326457425 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.327 [INFO][4675] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.588 [INFO][4675] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.588 [INFO][4675] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.626 [INFO][4675] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.646 [INFO][4675] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.652 [INFO][4675] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.653 [INFO][4675] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.664 [INFO][4675] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.664 [INFO][4675] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.668 [INFO][4675] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796 Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.675 [INFO][4675] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.683 [INFO][4675] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.685 [INFO][4675] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" host="localhost" Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.685 [INFO][4675] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:07.739069 containerd[1621]: 2025-11-24 06:57:07.685 [INFO][4675] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" HandleID="k8s-pod-network.c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Workload="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.695 [INFO][4652] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--swsqx-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"3079118d-9876-4056-a671-92f88f5f8c3d", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-666569f655-swsqx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali50646ab4607", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.696 [INFO][4652] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.697 [INFO][4652] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50646ab4607 ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.705 [INFO][4652] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.707 [INFO][4652] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" 
WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--swsqx-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"3079118d-9876-4056-a671-92f88f5f8c3d", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796", Pod:"goldmane-666569f655-swsqx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali50646ab4607", MAC:"2a:de:2b:9d:dd:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.739660 containerd[1621]: 2025-11-24 06:57:07.730 [INFO][4652] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" Namespace="calico-system" Pod="goldmane-666569f655-swsqx" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--swsqx-eth0" Nov 24 06:57:07.765722 containerd[1621]: time="2025-11-24T06:57:07.765565210Z" level=info msg="StartContainer for 
\"1f2c429182f7fe5103e0ac382826be88af66c56c6fb75773152e9d873f3d8b81\" returns successfully" Nov 24 06:57:07.783390 containerd[1621]: time="2025-11-24T06:57:07.783116027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gkfn4,Uid:7332a908-d485-404d-91ee-a472cfca4232,Namespace:kube-system,Attempt:0,} returns sandbox id \"67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471\"" Nov 24 06:57:07.785499 containerd[1621]: time="2025-11-24T06:57:07.785309407Z" level=info msg="connecting to shim c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796" address="unix:///run/containerd/s/8bbe36a0af256c5af96c1260b5bee58da0fca1709ce8cf16a7171475244e937d" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:07.797205 containerd[1621]: time="2025-11-24T06:57:07.797053538Z" level=info msg="CreateContainer within sandbox \"67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Nov 24 06:57:07.836466 containerd[1621]: time="2025-11-24T06:57:07.836436321Z" level=info msg="Container 3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4: CDI devices from CRI Config.CDIDevices: []" Nov 24 06:57:07.840850 systemd[1]: Started cri-containerd-c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796.scope - libcontainer container c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796. 
Nov 24 06:57:07.861845 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:07.867823 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:07.871681 systemd-networkd[1502]: calid8cd48d8e8e: Link UP Nov 24 06:57:07.882493 systemd-networkd[1502]: calid8cd48d8e8e: Gained carrier Nov 24 06:57:07.888329 containerd[1621]: time="2025-11-24T06:57:07.888288128Z" level=info msg="CreateContainer within sandbox \"67d5ce6be2f7e95ef0c3f0b8d98956fd128d11df9aaefb0c06c7d6bb7aa04471\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4\"" Nov 24 06:57:07.890430 containerd[1621]: time="2025-11-24T06:57:07.890395483Z" level=info msg="StartContainer for \"3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4\"" Nov 24 06:57:07.893490 containerd[1621]: time="2025-11-24T06:57:07.893448470Z" level=info msg="connecting to shim 3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4" address="unix:///run/containerd/s/1022a469a6b6ef95324f9e4364e8afe04aac5ae3c186784018d6771fc1c3b3a7" protocol=ttrpc version=3 Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.224 [INFO][4642] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0 calico-apiserver-6757d5779b- calico-apiserver 8655a062-8ee8-4565-9af9-1c36ab263987 831 0 2025-11-24 06:56:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6757d5779b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6757d5779b-2xfwn eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] calid8cd48d8e8e [] [] }} ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.224 [INFO][4642] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.329 [INFO][4679] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" HandleID="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Workload="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.329 [INFO][4679] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" HandleID="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Workload="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000397880), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6757d5779b-2xfwn", "timestamp":"2025-11-24 06:57:07.329527203 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.329 [INFO][4679] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM 
lock. Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.686 [INFO][4679] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.687 [INFO][4679] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.723 [INFO][4679] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.758 [INFO][4679] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.773 [INFO][4679] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.778 [INFO][4679] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.783 [INFO][4679] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.786 [INFO][4679] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.794 [INFO][4679] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.808 [INFO][4679] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.840 [INFO][4679] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.840 [INFO][4679] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" host="localhost" Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.840 [INFO][4679] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:07.909105 containerd[1621]: 2025-11-24 06:57:07.840 [INFO][4679] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" HandleID="k8s-pod-network.64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Workload="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909679 containerd[1621]: 2025-11-24 06:57:07.864 [INFO][4642] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0", GenerateName:"calico-apiserver-6757d5779b-", Namespace:"calico-apiserver", SelfLink:"", UID:"8655a062-8ee8-4565-9af9-1c36ab263987", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"6757d5779b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6757d5779b-2xfwn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid8cd48d8e8e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.909679 containerd[1621]: 2025-11-24 06:57:07.864 [INFO][4642] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909679 containerd[1621]: 2025-11-24 06:57:07.864 [INFO][4642] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid8cd48d8e8e ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909679 containerd[1621]: 2025-11-24 06:57:07.883 [INFO][4642] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.909679 
containerd[1621]: 2025-11-24 06:57:07.884 [INFO][4642] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0", GenerateName:"calico-apiserver-6757d5779b-", Namespace:"calico-apiserver", SelfLink:"", UID:"8655a062-8ee8-4565-9af9-1c36ab263987", ResourceVersion:"831", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6757d5779b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed", Pod:"calico-apiserver-6757d5779b-2xfwn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid8cd48d8e8e", MAC:"6a:ba:ed:59:26:d7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:07.909679 containerd[1621]: 2025-11-24 
06:57:07.903 [INFO][4642] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" Namespace="calico-apiserver" Pod="calico-apiserver-6757d5779b-2xfwn" WorkloadEndpoint="localhost-k8s-calico--apiserver--6757d5779b--2xfwn-eth0" Nov 24 06:57:07.924127 systemd[1]: Started cri-containerd-3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4.scope - libcontainer container 3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4. Nov 24 06:57:07.954949 containerd[1621]: time="2025-11-24T06:57:07.954918901Z" level=info msg="connecting to shim 64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed" address="unix:///run/containerd/s/b8ae5c24624981d4b4b08de3c8d2489ff3496be7da1073768e43844058dbb269" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:07.977231 containerd[1621]: time="2025-11-24T06:57:07.977180693Z" level=info msg="StartContainer for \"3da2437193b055ed620eedec146983aebb18ed431bac732df73013e4835228d4\" returns successfully" Nov 24 06:57:07.993985 systemd[1]: Started cri-containerd-64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed.scope - libcontainer container 64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed. 
Nov 24 06:57:08.017974 containerd[1621]: time="2025-11-24T06:57:08.017941808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-swsqx,Uid:3079118d-9876-4056-a671-92f88f5f8c3d,Namespace:calico-system,Attempt:0,} returns sandbox id \"c32e0873e8237d06a51ef1a2ddbc6265d8f4ba73e9c5c7cccb9be32352a56796\"" Nov 24 06:57:08.023083 containerd[1621]: time="2025-11-24T06:57:08.023056596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:57:08.058312 containerd[1621]: time="2025-11-24T06:57:08.058260971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c65846d8b-n55vh,Uid:77c9c1c2-ff92-4d21-b427-835b49d2e048,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"deb6c5f5466770671ad6926eccdc92ca20061565481a1987bd232e7612125774\"" Nov 24 06:57:08.060942 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:08.139099 containerd[1621]: time="2025-11-24T06:57:08.138442849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6757d5779b-2xfwn,Uid:8655a062-8ee8-4565-9af9-1c36ab263987,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"64a99cac532220b56151bfa4d6c013fbdc12590c17d5d788c7c168afaef80eed\"" Nov 24 06:57:08.382246 containerd[1621]: time="2025-11-24T06:57:08.382157138Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:08.383415 containerd[1621]: time="2025-11-24T06:57:08.383363190Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:57:08.383725 containerd[1621]: time="2025-11-24T06:57:08.383417592Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:08.383794 kubelet[2932]: E1124 06:57:08.383726 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:08.383794 kubelet[2932]: E1124 06:57:08.383761 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:08.383998 kubelet[2932]: E1124 06:57:08.383940 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wctht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:08.384426 containerd[1621]: time="2025-11-24T06:57:08.384336512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:08.389474 kubelet[2932]: E1124 06:57:08.385380 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:08.503538 kubelet[2932]: E1124 06:57:08.503464 2932 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:08.583636 kubelet[2932]: I1124 06:57:08.579775 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-gkfn4" podStartSLOduration=40.554429788 podStartE2EDuration="40.554429788s" podCreationTimestamp="2025-11-24 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:57:08.529188572 +0000 UTC m=+47.544260374" watchObservedRunningTime="2025-11-24 06:57:08.554429788 +0000 UTC m=+47.569501596" Nov 24 06:57:08.584157 kubelet[2932]: I1124 06:57:08.583957 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5jnz6" podStartSLOduration=40.583946214 podStartE2EDuration="40.583946214s" podCreationTimestamp="2025-11-24 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-24 06:57:08.583860813 +0000 UTC m=+47.598932617" watchObservedRunningTime="2025-11-24 06:57:08.583946214 +0000 UTC m=+47.599018021" Nov 24 06:57:08.724418 containerd[1621]: time="2025-11-24T06:57:08.724221292Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:08.724861 containerd[1621]: time="2025-11-24T06:57:08.724843860Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:08.724945 containerd[1621]: time="2025-11-24T06:57:08.724900198Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:08.725168 kubelet[2932]: E1124 06:57:08.725065 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:08.725168 kubelet[2932]: E1124 06:57:08.725112 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:08.725324 kubelet[2932]: E1124 06:57:08.725280 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74s7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:08.725594 containerd[1621]: time="2025-11-24T06:57:08.725566957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:08.726742 kubelet[2932]: E1124 06:57:08.726705 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:57:08.828749 systemd-networkd[1502]: cali2b3b243f52d: Gained IPv6LL Nov 24 06:57:09.015836 containerd[1621]: time="2025-11-24T06:57:09.015741538Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:09.019586 containerd[1621]: time="2025-11-24T06:57:09.019550255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:09.019691 containerd[1621]: time="2025-11-24T06:57:09.019612065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:09.019742 kubelet[2932]: E1124 06:57:09.019705 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:09.019742 kubelet[2932]: E1124 06:57:09.019739 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:09.019849 kubelet[2932]: E1124 06:57:09.019820 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9ck7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:09.021121 kubelet[2932]: E1124 06:57:09.021087 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:57:09.084706 systemd-networkd[1502]: cali710e0cb15cb: Gained IPv6LL Nov 24 06:57:09.108008 containerd[1621]: time="2025-11-24T06:57:09.107899163Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-79b4855f45-7htjs,Uid:e9872522-30cd-4303-b0f4-9d477ec17bc5,Namespace:calico-system,Attempt:0,}" Nov 24 06:57:09.200001 systemd-networkd[1502]: cali035f2a6f893: Link UP Nov 24 06:57:09.200103 systemd-networkd[1502]: cali035f2a6f893: Gained carrier Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.152 [INFO][5047] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0 calico-kube-controllers-79b4855f45- calico-system e9872522-30cd-4303-b0f4-9d477ec17bc5 836 0 2025-11-24 06:56:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:79b4855f45 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-79b4855f45-7htjs eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali035f2a6f893 [] [] }} ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.152 [INFO][5047] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.172 [INFO][5060] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" 
HandleID="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Workload="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.173 [INFO][5060] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" HandleID="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Workload="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f130), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-79b4855f45-7htjs", "timestamp":"2025-11-24 06:57:09.172181739 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.173 [INFO][5060] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.173 [INFO][5060] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
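The IPAM sequence recorded above — acquire the host-wide lock, look up the host's block affinities, load the 192.168.88.128/26 block, then claim one address from it — can be mirrored with a small sketch. This is not Calico's implementation; it only illustrates the containment and capacity arithmetic for a /26 block using Python's standard `ipaddress` module, and the `allocated` set below is a hypothetical stand-in for the handles already recorded in the block.

```python
import ipaddress

# The affine block Calico loaded for host "localhost" (from the log above).
block = ipaddress.ip_network("192.168.88.128/26")

# The address the IPAM plugin later claims from this block.
claimed = ipaddress.ip_address("192.168.88.137")

# A /26 holds 64 addresses: .128 through .191 in this case.
assert block.num_addresses == 64
assert claimed in block

# Sketch of "attempt to assign 1 address from block": take the first
# host address not yet present in a (hypothetical) allocation set.
allocated = {ipaddress.ip_address(f"192.168.88.{i}") for i in range(128, 137)}
free = next(a for a in block.hosts() if a not in allocated)
print(free)  # 192.168.88.137
```

With .128–.136 marked as taken, the next free host address is .137 — matching the address the log shows being claimed for the calico-kube-controllers pod.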
Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.173 [INFO][5060] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.178 [INFO][5060] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.180 [INFO][5060] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.183 [INFO][5060] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.184 [INFO][5060] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.186 [INFO][5060] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.187 [INFO][5060] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.188 [INFO][5060] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.191 [INFO][5060] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.195 [INFO][5060] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.196 [INFO][5060] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" host="localhost" Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.196 [INFO][5060] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Nov 24 06:57:09.212642 containerd[1621]: 2025-11-24 06:57:09.196 [INFO][5060] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" HandleID="k8s-pod-network.8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Workload="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.215188 containerd[1621]: 2025-11-24 06:57:09.198 [INFO][5047] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0", GenerateName:"calico-kube-controllers-79b4855f45-", Namespace:"calico-system", SelfLink:"", UID:"e9872522-30cd-4303-b0f4-9d477ec17bc5", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79b4855f45", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-79b4855f45-7htjs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali035f2a6f893", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:09.215188 containerd[1621]: 2025-11-24 06:57:09.198 [INFO][5047] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.215188 containerd[1621]: 2025-11-24 06:57:09.198 [INFO][5047] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali035f2a6f893 ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.215188 containerd[1621]: 2025-11-24 06:57:09.200 [INFO][5047] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.215188 containerd[1621]: 
2025-11-24 06:57:09.200 [INFO][5047] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0", GenerateName:"calico-kube-controllers-79b4855f45-", Namespace:"calico-system", SelfLink:"", UID:"e9872522-30cd-4303-b0f4-9d477ec17bc5", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.November, 24, 6, 56, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79b4855f45", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb", Pod:"calico-kube-controllers-79b4855f45-7htjs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali035f2a6f893", MAC:"6e:22:b2:dd:a7:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Nov 24 06:57:09.215188 containerd[1621]: 
2025-11-24 06:57:09.210 [INFO][5047] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" Namespace="calico-system" Pod="calico-kube-controllers-79b4855f45-7htjs" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--79b4855f45--7htjs-eth0" Nov 24 06:57:09.241379 containerd[1621]: time="2025-11-24T06:57:09.241343090Z" level=info msg="connecting to shim 8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb" address="unix:///run/containerd/s/cd81098547984d2dd63024ba57310a8abe730123b704ffb22f263b3e6209db41" namespace=k8s.io protocol=ttrpc version=3 Nov 24 06:57:09.264783 systemd[1]: Started cri-containerd-8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb.scope - libcontainer container 8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb. Nov 24 06:57:09.274296 systemd-resolved[1503]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Nov 24 06:57:09.276714 systemd-networkd[1502]: calid8cd48d8e8e: Gained IPv6LL Nov 24 06:57:09.300152 containerd[1621]: time="2025-11-24T06:57:09.300127693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79b4855f45-7htjs,Uid:e9872522-30cd-4303-b0f4-9d477ec17bc5,Namespace:calico-system,Attempt:0,} returns sandbox id \"8e06635dc7fcfba2227a0aa571f6d42bae9ce64fe6a1ad33b251975fab03fdbb\"" Nov 24 06:57:09.302097 containerd[1621]: time="2025-11-24T06:57:09.302069066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:57:09.340726 systemd-networkd[1502]: cali8b984ee59b6: Gained IPv6LL Nov 24 06:57:09.514884 kubelet[2932]: E1124 06:57:09.514234 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:09.514884 kubelet[2932]: E1124 06:57:09.514235 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:57:09.514884 kubelet[2932]: E1124 06:57:09.514396 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:57:09.640232 containerd[1621]: time="2025-11-24T06:57:09.640070852Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:09.640762 containerd[1621]: time="2025-11-24T06:57:09.640685323Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:57:09.640762 containerd[1621]: time="2025-11-24T06:57:09.640736306Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:09.641127 kubelet[2932]: E1124 06:57:09.640956 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:09.641127 kubelet[2932]: E1124 06:57:09.640988 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:09.641127 kubelet[2932]: E1124 06:57:09.641086 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:09.642545 kubelet[2932]: E1124 06:57:09.642462 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:09.724763 systemd-networkd[1502]: cali50646ab4607: Gained IPv6LL Nov 24 06:57:10.516022 kubelet[2932]: E1124 06:57:10.515976 2932 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:10.557092 systemd-networkd[1502]: cali035f2a6f893: Gained IPv6LL Nov 24 06:57:17.106432 containerd[1621]: time="2025-11-24T06:57:17.106390527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:57:17.457552 containerd[1621]: time="2025-11-24T06:57:17.457457600Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:17.457802 containerd[1621]: time="2025-11-24T06:57:17.457781941Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:57:17.457844 containerd[1621]: time="2025-11-24T06:57:17.457827677Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Nov 24 06:57:17.458058 kubelet[2932]: E1124 06:57:17.458023 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:17.458256 kubelet[2932]: E1124 06:57:17.458069 2932 
kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:17.460543 kubelet[2932]: E1124 06:57:17.458181 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f83afd19fc7845bbac116113552ede1f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:17.462760 containerd[1621]: time="2025-11-24T06:57:17.462733193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:57:17.804694 containerd[1621]: time="2025-11-24T06:57:17.804526098Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:17.818513 containerd[1621]: time="2025-11-24T06:57:17.818469711Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:57:17.818776 containerd[1621]: time="2025-11-24T06:57:17.818518182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:17.818977 kubelet[2932]: E1124 06:57:17.818908 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:17.818977 kubelet[2932]: E1124 06:57:17.818945 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:17.819153 kubelet[2932]: E1124 06:57:17.819125 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevice
s:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:17.821017 kubelet[2932]: E1124 06:57:17.820982 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:18.105601 containerd[1621]: time="2025-11-24T06:57:18.105513985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:18.432650 containerd[1621]: time="2025-11-24T06:57:18.432558256Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:18.432976 containerd[1621]: time="2025-11-24T06:57:18.432957461Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:18.433206 containerd[1621]: time="2025-11-24T06:57:18.433009614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:18.433530 kubelet[2932]: E1124 06:57:18.433101 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:18.433530 kubelet[2932]: E1124 06:57:18.433152 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:18.433530 kubelet[2932]: E1124 06:57:18.433243 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kszgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:18.434744 kubelet[2932]: E1124 06:57:18.434728 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:21.181754 containerd[1621]: time="2025-11-24T06:57:21.181722476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:57:21.549214 containerd[1621]: time="2025-11-24T06:57:21.549074808Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:21.553075 containerd[1621]: time="2025-11-24T06:57:21.553051367Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:57:21.553170 containerd[1621]: time="2025-11-24T06:57:21.553062229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:21.553297 kubelet[2932]: E1124 06:57:21.553271 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:21.553704 kubelet[2932]: E1124 06:57:21.553505 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:21.553819 containerd[1621]: time="2025-11-24T06:57:21.553684699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:57:21.583657 kubelet[2932]: E1124 06:57:21.583582 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,Sub
PathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wctht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:21.584828 kubelet[2932]: E1124 06:57:21.584802 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:21.850217 containerd[1621]: time="2025-11-24T06:57:21.850068349Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:21.857189 containerd[1621]: time="2025-11-24T06:57:21.857094118Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:57:21.857189 containerd[1621]: time="2025-11-24T06:57:21.857160532Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:57:21.857315 kubelet[2932]: E1124 06:57:21.857266 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:21.857315 kubelet[2932]: E1124 06:57:21.857304 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:21.857415 kubelet[2932]: E1124 06:57:21.857385 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:21.859433 containerd[1621]: time="2025-11-24T06:57:21.859421088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:57:22.225442 containerd[1621]: time="2025-11-24T06:57:22.225340722Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:22.226035 containerd[1621]: time="2025-11-24T06:57:22.225760571Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:57:22.226035 containerd[1621]: time="2025-11-24T06:57:22.225798069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:57:22.226117 kubelet[2932]: E1124 06:57:22.225924 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:22.226117 kubelet[2932]: E1124 06:57:22.225965 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:22.226310 kubelet[2932]: E1124 
06:57:22.226219 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:22.226528 containerd[1621]: time="2025-11-24T06:57:22.226512606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:22.227712 kubelet[2932]: E1124 06:57:22.227603 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:57:22.567136 containerd[1621]: time="2025-11-24T06:57:22.567101299Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:22.567527 containerd[1621]: time="2025-11-24T06:57:22.567426285Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:22.567527 
containerd[1621]: time="2025-11-24T06:57:22.567485879Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:22.567842 kubelet[2932]: E1124 06:57:22.567584 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:22.567842 kubelet[2932]: E1124 06:57:22.567613 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:22.567842 kubelet[2932]: E1124 06:57:22.567781 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9ck7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:22.568547 containerd[1621]: time="2025-11-24T06:57:22.567937280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:57:22.569432 kubelet[2932]: E1124 06:57:22.569410 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:57:22.919805 containerd[1621]: time="2025-11-24T06:57:22.919556050Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:22.924739 containerd[1621]: time="2025-11-24T06:57:22.924711473Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:57:22.924830 containerd[1621]: time="2025-11-24T06:57:22.924766475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:22.924894 kubelet[2932]: E1124 06:57:22.924870 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:22.924929 kubelet[2932]: E1124 06:57:22.924902 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:22.925034 kubelet[2932]: E1124 06:57:22.924989 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:22.926363 kubelet[2932]: E1124 06:57:22.926257 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:25.105406 containerd[1621]: time="2025-11-24T06:57:25.105383031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:25.431295 containerd[1621]: time="2025-11-24T06:57:25.431132154Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:25.431678 containerd[1621]: time="2025-11-24T06:57:25.431424379Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:25.431678 containerd[1621]: time="2025-11-24T06:57:25.431466394Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:25.431738 kubelet[2932]: E1124 06:57:25.431575 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:25.431738 kubelet[2932]: E1124 06:57:25.431606 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:25.431738 kubelet[2932]: E1124 06:57:25.431708 2932 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74s7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:25.433052 kubelet[2932]: E1124 06:57:25.433010 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:57:30.106079 kubelet[2932]: E1124 06:57:30.106018 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:30.106896 kubelet[2932]: E1124 06:57:30.106234 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:33.105754 kubelet[2932]: E1124 06:57:33.105715 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:34.105120 kubelet[2932]: E1124 06:57:34.104874 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:35.106378 kubelet[2932]: E1124 06:57:35.106312 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:57:37.106557 kubelet[2932]: E1124 06:57:37.106497 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:57:38.130164 kubelet[2932]: E1124 06:57:38.129922 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:57:43.106918 containerd[1621]: time="2025-11-24T06:57:43.106249676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Nov 24 06:57:43.436384 containerd[1621]: time="2025-11-24T06:57:43.436168237Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:43.436675 containerd[1621]: time="2025-11-24T06:57:43.436652764Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Nov 24 06:57:43.436722 containerd[1621]: time="2025-11-24T06:57:43.436702993Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, 
bytes read=73" Nov 24 06:57:43.437737 kubelet[2932]: E1124 06:57:43.437712 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:43.438390 kubelet[2932]: E1124 06:57:43.437949 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Nov 24 06:57:43.438390 kubelet[2932]: E1124 06:57:43.438128 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f83afd19fc7845bbac116113552ede1f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:43.438875 containerd[1621]: time="2025-11-24T06:57:43.438654500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:43.799053 containerd[1621]: time="2025-11-24T06:57:43.799015938Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:43.799521 containerd[1621]: time="2025-11-24T06:57:43.799497617Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:43.799576 containerd[1621]: time="2025-11-24T06:57:43.799561621Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:43.800108 kubelet[2932]: E1124 06:57:43.799711 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:43.800108 kubelet[2932]: E1124 06:57:43.799747 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:43.800108 kubelet[2932]: E1124 06:57:43.799895 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kszgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:43.800807 containerd[1621]: time="2025-11-24T06:57:43.800415554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Nov 24 06:57:43.801198 kubelet[2932]: E1124 06:57:43.801078 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:44.119570 containerd[1621]: 
time="2025-11-24T06:57:44.119494298Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:44.120258 containerd[1621]: time="2025-11-24T06:57:44.119845020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Nov 24 06:57:44.120258 containerd[1621]: time="2025-11-24T06:57:44.119897421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:44.120320 kubelet[2932]: E1124 06:57:44.119971 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:44.120320 kubelet[2932]: E1124 06:57:44.120014 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Nov 24 06:57:44.120369 containerd[1621]: time="2025-11-24T06:57:44.120264536Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Nov 24 06:57:44.120562 kubelet[2932]: E1124 06:57:44.120219 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:44.121584 kubelet[2932]: E1124 06:57:44.121562 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:44.429057 containerd[1621]: time="2025-11-24T06:57:44.428951107Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:44.429606 containerd[1621]: time="2025-11-24T06:57:44.429557177Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Nov 24 06:57:44.429672 containerd[1621]: time="2025-11-24T06:57:44.429655825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:44.429851 kubelet[2932]: E1124 06:57:44.429788 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:44.429851 kubelet[2932]: E1124 06:57:44.429844 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Nov 24 06:57:44.430311 kubelet[2932]: E1124 06:57:44.430098 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveRe
adOnly:nil,},VolumeMount{Name:kube-api-access-wctht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:44.431497 kubelet[2932]: E1124 06:57:44.431481 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:47.106171 containerd[1621]: time="2025-11-24T06:57:47.105737970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Nov 24 06:57:47.445783 containerd[1621]: time="2025-11-24T06:57:47.445709401Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:47.446311 containerd[1621]: time="2025-11-24T06:57:47.446293982Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Nov 24 06:57:47.446369 containerd[1621]: time="2025-11-24T06:57:47.446351282Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Nov 24 06:57:47.446512 kubelet[2932]: E1124 06:57:47.446484 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:47.447352 kubelet[2932]: E1124 06:57:47.446519 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Nov 24 06:57:47.447352 kubelet[2932]: E1124 06:57:47.446600 2932 kuberuntime_manager.go:1358] 
"Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:47.448119 containerd[1621]: time="2025-11-24T06:57:47.448088992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Nov 24 06:57:47.510363 systemd[1]: Started sshd@7-139.178.70.102:22-147.75.109.163:50880.service - OpenSSH per-connection server daemon (147.75.109.163:50880). Nov 24 06:57:47.601221 sshd[5203]: Accepted publickey for core from 147.75.109.163 port 50880 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:57:47.603084 sshd-session[5203]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:57:47.609667 systemd-logind[1591]: New session 10 of user core. Nov 24 06:57:47.614921 systemd[1]: Started session-10.scope - Session 10 of User core. Nov 24 06:57:47.806343 containerd[1621]: time="2025-11-24T06:57:47.806313364Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:47.806580 containerd[1621]: time="2025-11-24T06:57:47.806562316Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Nov 24 06:57:47.806671 containerd[1621]: time="2025-11-24T06:57:47.806600766Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Nov 24 06:57:47.806991 kubelet[2932]: E1124 06:57:47.806857 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:47.806991 kubelet[2932]: E1124 06:57:47.806931 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Nov 24 06:57:47.807689 kubelet[2932]: E1124 06:57:47.807655 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:47.808888 kubelet[2932]: E1124 06:57:47.808855 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 
06:57:48.100968 sshd[5212]: Connection closed by 147.75.109.163 port 50880 Nov 24 06:57:48.100907 sshd-session[5203]: pam_unix(sshd:session): session closed for user core Nov 24 06:57:48.107192 systemd[1]: sshd@7-139.178.70.102:22-147.75.109.163:50880.service: Deactivated successfully. Nov 24 06:57:48.108729 systemd-logind[1591]: Session 10 logged out. Waiting for processes to exit. Nov 24 06:57:48.108870 systemd[1]: session-10.scope: Deactivated successfully. Nov 24 06:57:48.109764 systemd-logind[1591]: Removed session 10. Nov 24 06:57:48.111218 containerd[1621]: time="2025-11-24T06:57:48.111194106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Nov 24 06:57:48.453174 containerd[1621]: time="2025-11-24T06:57:48.453096844Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:48.453529 containerd[1621]: time="2025-11-24T06:57:48.453505795Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Nov 24 06:57:48.453595 containerd[1621]: time="2025-11-24T06:57:48.453579236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Nov 24 06:57:48.453765 kubelet[2932]: E1124 06:57:48.453743 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:48.453994 kubelet[2932]: E1124 06:57:48.453776 2932 kuberuntime_image.go:42] 
"Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Nov 24 06:57:48.453994 kubelet[2932]: E1124 06:57:48.453863 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:48.455467 kubelet[2932]: E1124 06:57:48.455440 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:49.111821 containerd[1621]: time="2025-11-24T06:57:49.111794281Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:49.486754 containerd[1621]: time="2025-11-24T06:57:49.486397659Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:49.486960 containerd[1621]: time="2025-11-24T06:57:49.486939951Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:49.487091 containerd[1621]: time="2025-11-24T06:57:49.487002688Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:49.487200 kubelet[2932]: E1124 06:57:49.487168 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:49.487406 kubelet[2932]: E1124 06:57:49.487208 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:49.487406 kubelet[2932]: E1124 06:57:49.487294 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9ck7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:49.489118 kubelet[2932]: E1124 06:57:49.489025 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:57:53.106670 containerd[1621]: time="2025-11-24T06:57:53.106453545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Nov 24 06:57:53.111545 systemd[1]: Started 
sshd@8-139.178.70.102:22-147.75.109.163:52274.service - OpenSSH per-connection server daemon (147.75.109.163:52274). Nov 24 06:57:53.172381 sshd[5227]: Accepted publickey for core from 147.75.109.163 port 52274 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:57:53.173181 sshd-session[5227]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:57:53.175831 systemd-logind[1591]: New session 11 of user core. Nov 24 06:57:53.183767 systemd[1]: Started session-11.scope - Session 11 of User core. Nov 24 06:57:53.357102 sshd[5230]: Connection closed by 147.75.109.163 port 52274 Nov 24 06:57:53.357552 sshd-session[5227]: pam_unix(sshd:session): session closed for user core Nov 24 06:57:53.359477 systemd[1]: sshd@8-139.178.70.102:22-147.75.109.163:52274.service: Deactivated successfully. Nov 24 06:57:53.360894 systemd[1]: session-11.scope: Deactivated successfully. Nov 24 06:57:53.362044 systemd-logind[1591]: Session 11 logged out. Waiting for processes to exit. Nov 24 06:57:53.362976 systemd-logind[1591]: Removed session 11. 
Nov 24 06:57:53.446677 containerd[1621]: time="2025-11-24T06:57:53.446553254Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Nov 24 06:57:53.448353 containerd[1621]: time="2025-11-24T06:57:53.448284804Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Nov 24 06:57:53.448403 containerd[1621]: time="2025-11-24T06:57:53.448345370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Nov 24 06:57:53.448503 kubelet[2932]: E1124 06:57:53.448477 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:53.448733 kubelet[2932]: E1124 06:57:53.448512 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Nov 24 06:57:53.448804 kubelet[2932]: E1124 06:57:53.448778 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74s7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Nov 24 06:57:53.449906 kubelet[2932]: E1124 06:57:53.449883 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:57:55.107524 kubelet[2932]: E1124 06:57:55.107492 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:57:57.107166 kubelet[2932]: E1124 06:57:57.106891 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:57:58.370986 systemd[1]: Started sshd@9-139.178.70.102:22-147.75.109.163:52278.service - OpenSSH per-connection server daemon (147.75.109.163:52278). Nov 24 06:57:58.484081 sshd[5245]: Accepted publickey for core from 147.75.109.163 port 52278 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:57:58.491217 sshd-session[5245]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:57:58.496506 systemd-logind[1591]: New session 12 of user core. Nov 24 06:57:58.503707 systemd[1]: Started session-12.scope - Session 12 of User core. Nov 24 06:57:58.616954 sshd[5248]: Connection closed by 147.75.109.163 port 52278 Nov 24 06:57:58.617140 sshd-session[5245]: pam_unix(sshd:session): session closed for user core Nov 24 06:57:58.625508 systemd[1]: sshd@9-139.178.70.102:22-147.75.109.163:52278.service: Deactivated successfully. Nov 24 06:57:58.626822 systemd[1]: session-12.scope: Deactivated successfully. Nov 24 06:57:58.627420 systemd-logind[1591]: Session 12 logged out. Waiting for processes to exit. Nov 24 06:57:58.629261 systemd[1]: Started sshd@10-139.178.70.102:22-147.75.109.163:52280.service - OpenSSH per-connection server daemon (147.75.109.163:52280). Nov 24 06:57:58.630246 systemd-logind[1591]: Removed session 12. 
Nov 24 06:57:58.688025 sshd[5261]: Accepted publickey for core from 147.75.109.163 port 52280 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:57:58.689030 sshd-session[5261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:57:58.695762 systemd-logind[1591]: New session 13 of user core. Nov 24 06:57:58.699717 systemd[1]: Started session-13.scope - Session 13 of User core. Nov 24 06:57:58.883271 sshd[5264]: Connection closed by 147.75.109.163 port 52280 Nov 24 06:57:58.884462 sshd-session[5261]: pam_unix(sshd:session): session closed for user core Nov 24 06:57:58.892415 systemd[1]: sshd@10-139.178.70.102:22-147.75.109.163:52280.service: Deactivated successfully. Nov 24 06:57:58.895150 systemd[1]: session-13.scope: Deactivated successfully. Nov 24 06:57:58.896744 systemd-logind[1591]: Session 13 logged out. Waiting for processes to exit. Nov 24 06:57:58.898748 systemd-logind[1591]: Removed session 13. Nov 24 06:57:58.900881 systemd[1]: Started sshd@11-139.178.70.102:22-147.75.109.163:52292.service - OpenSSH per-connection server daemon (147.75.109.163:52292). Nov 24 06:57:58.975540 sshd[5274]: Accepted publickey for core from 147.75.109.163 port 52292 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:57:58.976889 sshd-session[5274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:57:58.980360 systemd-logind[1591]: New session 14 of user core. Nov 24 06:57:58.984713 systemd[1]: Started session-14.scope - Session 14 of User core. 
Nov 24 06:57:59.102509 sshd[5277]: Connection closed by 147.75.109.163 port 52292 Nov 24 06:57:59.103131 sshd-session[5274]: pam_unix(sshd:session): session closed for user core Nov 24 06:57:59.106584 kubelet[2932]: E1124 06:57:59.106561 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:57:59.107609 kubelet[2932]: E1124 06:57:59.107588 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:57:59.108283 systemd[1]: sshd@11-139.178.70.102:22-147.75.109.163:52292.service: Deactivated successfully. Nov 24 06:57:59.110348 systemd[1]: session-14.scope: Deactivated successfully. Nov 24 06:57:59.111891 systemd-logind[1591]: Session 14 logged out. Waiting for processes to exit. Nov 24 06:57:59.112944 systemd-logind[1591]: Removed session 14. 
Nov 24 06:58:01.106494 kubelet[2932]: E1124 06:58:01.106453 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:58:04.106074 kubelet[2932]: E1124 06:58:04.106043 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:58:04.112849 systemd[1]: Started sshd@12-139.178.70.102:22-147.75.109.163:47964.service - OpenSSH per-connection server daemon (147.75.109.163:47964). 
Nov 24 06:58:04.412347 sshd[5295]: Accepted publickey for core from 147.75.109.163 port 47964 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:58:04.414132 sshd-session[5295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:58:04.417315 systemd-logind[1591]: New session 15 of user core. Nov 24 06:58:04.423846 systemd[1]: Started session-15.scope - Session 15 of User core. Nov 24 06:58:04.562951 sshd[5298]: Connection closed by 147.75.109.163 port 47964 Nov 24 06:58:04.563001 sshd-session[5295]: pam_unix(sshd:session): session closed for user core Nov 24 06:58:04.569450 systemd[1]: sshd@12-139.178.70.102:22-147.75.109.163:47964.service: Deactivated successfully. Nov 24 06:58:04.571302 systemd[1]: session-15.scope: Deactivated successfully. Nov 24 06:58:04.572446 systemd-logind[1591]: Session 15 logged out. Waiting for processes to exit. Nov 24 06:58:04.573464 systemd-logind[1591]: Removed session 15. Nov 24 06:58:07.106478 kubelet[2932]: E1124 06:58:07.106436 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048" Nov 24 06:58:09.572523 systemd[1]: Started sshd@13-139.178.70.102:22-147.75.109.163:47980.service - OpenSSH per-connection server daemon (147.75.109.163:47980). 
Nov 24 06:58:09.644182 sshd[5337]: Accepted publickey for core from 147.75.109.163 port 47980 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:58:09.648208 sshd-session[5337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:58:09.654450 systemd-logind[1591]: New session 16 of user core. Nov 24 06:58:09.660710 systemd[1]: Started session-16.scope - Session 16 of User core. Nov 24 06:58:09.811914 sshd[5340]: Connection closed by 147.75.109.163 port 47980 Nov 24 06:58:09.812398 sshd-session[5337]: pam_unix(sshd:session): session closed for user core Nov 24 06:58:09.815188 systemd-logind[1591]: Session 16 logged out. Waiting for processes to exit. Nov 24 06:58:09.815264 systemd[1]: sshd@13-139.178.70.102:22-147.75.109.163:47980.service: Deactivated successfully. Nov 24 06:58:09.816414 systemd[1]: session-16.scope: Deactivated successfully. Nov 24 06:58:09.819575 systemd-logind[1591]: Removed session 16. Nov 24 06:58:10.107000 kubelet[2932]: E1124 06:58:10.106386 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d" Nov 24 06:58:10.107858 kubelet[2932]: E1124 06:58:10.107421 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f" Nov 24 06:58:10.108355 kubelet[2932]: E1124 06:58:10.108282 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62" Nov 24 06:58:13.107723 kubelet[2932]: E1124 06:58:13.107687 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5" Nov 24 06:58:14.828281 systemd[1]: 
Started sshd@14-139.178.70.102:22-147.75.109.163:60022.service - OpenSSH per-connection server daemon (147.75.109.163:60022). Nov 24 06:58:14.881537 sshd[5352]: Accepted publickey for core from 147.75.109.163 port 60022 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:58:14.882401 sshd-session[5352]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:58:14.887678 systemd-logind[1591]: New session 17 of user core. Nov 24 06:58:14.892791 systemd[1]: Started session-17.scope - Session 17 of User core. Nov 24 06:58:15.107981 kubelet[2932]: E1124 06:58:15.107842 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536" Nov 24 06:58:15.368739 sshd[5355]: Connection closed by 147.75.109.163 port 60022 Nov 24 06:58:15.410896 sshd-session[5352]: pam_unix(sshd:session): session closed for user core Nov 24 06:58:15.458353 systemd[1]: sshd@14-139.178.70.102:22-147.75.109.163:60022.service: Deactivated successfully. 
Nov 24 06:58:15.459645 systemd[1]: session-17.scope: Deactivated successfully. Nov 24 06:58:15.460232 systemd-logind[1591]: Session 17 logged out. Waiting for processes to exit. Nov 24 06:58:15.461022 systemd-logind[1591]: Removed session 17. Nov 24 06:58:18.105788 kubelet[2932]: E1124 06:58:18.105716 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987" Nov 24 06:58:20.381015 systemd[1]: Started sshd@15-139.178.70.102:22-147.75.109.163:60038.service - OpenSSH per-connection server daemon (147.75.109.163:60038). Nov 24 06:58:20.480591 sshd[5366]: Accepted publickey for core from 147.75.109.163 port 60038 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24 Nov 24 06:58:20.481532 sshd-session[5366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Nov 24 06:58:20.484989 systemd-logind[1591]: New session 18 of user core. Nov 24 06:58:20.496768 systemd[1]: Started session-18.scope - Session 18 of User core. Nov 24 06:58:20.596716 sshd[5369]: Connection closed by 147.75.109.163 port 60038 Nov 24 06:58:20.605058 systemd[1]: sshd@15-139.178.70.102:22-147.75.109.163:60038.service: Deactivated successfully. Nov 24 06:58:20.597830 sshd-session[5366]: pam_unix(sshd:session): session closed for user core Nov 24 06:58:20.606119 systemd[1]: session-18.scope: Deactivated successfully. Nov 24 06:58:20.606656 systemd-logind[1591]: Session 18 logged out. Waiting for processes to exit. 
Nov 24 06:58:20.608659 systemd[1]: Started sshd@16-139.178.70.102:22-147.75.109.163:60042.service - OpenSSH per-connection server daemon (147.75.109.163:60042).
Nov 24 06:58:20.610512 systemd-logind[1591]: Removed session 18.
Nov 24 06:58:20.649055 sshd[5380]: Accepted publickey for core from 147.75.109.163 port 60042 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:20.650573 sshd-session[5380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:20.655036 systemd-logind[1591]: New session 19 of user core.
Nov 24 06:58:20.658708 systemd[1]: Started session-19.scope - Session 19 of User core.
Nov 24 06:58:21.038843 sshd[5383]: Connection closed by 147.75.109.163 port 60042
Nov 24 06:58:21.038489 sshd-session[5380]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:21.047140 systemd[1]: sshd@16-139.178.70.102:22-147.75.109.163:60042.service: Deactivated successfully.
Nov 24 06:58:21.048553 systemd[1]: session-19.scope: Deactivated successfully.
Nov 24 06:58:21.049311 systemd-logind[1591]: Session 19 logged out. Waiting for processes to exit.
Nov 24 06:58:21.051386 systemd[1]: Started sshd@17-139.178.70.102:22-147.75.109.163:59600.service - OpenSSH per-connection server daemon (147.75.109.163:59600).
Nov 24 06:58:21.052934 systemd-logind[1591]: Removed session 19.
Nov 24 06:58:21.106951 kubelet[2932]: E1124 06:58:21.106924 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048"
Nov 24 06:58:21.138201 kubelet[2932]: E1124 06:58:21.138163 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f"
Nov 24 06:58:21.142696 sshd[5393]: Accepted publickey for core from 147.75.109.163 port 59600 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:21.143564 sshd-session[5393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:21.150822 systemd-logind[1591]: New session 20 of user core.
Nov 24 06:58:21.156719 systemd[1]: Started session-20.scope - Session 20 of User core.
Nov 24 06:58:21.743840 sshd[5398]: Connection closed by 147.75.109.163 port 59600
Nov 24 06:58:21.748912 sshd-session[5393]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:21.760759 systemd[1]: Started sshd@18-139.178.70.102:22-147.75.109.163:59610.service - OpenSSH per-connection server daemon (147.75.109.163:59610).
Nov 24 06:58:21.761239 systemd[1]: sshd@17-139.178.70.102:22-147.75.109.163:59600.service: Deactivated successfully.
Nov 24 06:58:21.762998 systemd[1]: session-20.scope: Deactivated successfully.
Nov 24 06:58:21.765839 systemd-logind[1591]: Session 20 logged out. Waiting for processes to exit.
Nov 24 06:58:21.769061 systemd-logind[1591]: Removed session 20.
Nov 24 06:58:21.838058 sshd[5408]: Accepted publickey for core from 147.75.109.163 port 59610 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:21.839306 sshd-session[5408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:21.842956 systemd-logind[1591]: New session 21 of user core.
Nov 24 06:58:21.851701 systemd[1]: Started session-21.scope - Session 21 of User core.
Nov 24 06:58:22.069555 sshd[5418]: Connection closed by 147.75.109.163 port 59610
Nov 24 06:58:22.069740 sshd-session[5408]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:22.077008 systemd[1]: sshd@18-139.178.70.102:22-147.75.109.163:59610.service: Deactivated successfully.
Nov 24 06:58:22.078856 systemd[1]: session-21.scope: Deactivated successfully.
Nov 24 06:58:22.081538 systemd-logind[1591]: Session 21 logged out. Waiting for processes to exit.
Nov 24 06:58:22.084453 systemd[1]: Started sshd@19-139.178.70.102:22-147.75.109.163:59620.service - OpenSSH per-connection server daemon (147.75.109.163:59620).
Nov 24 06:58:22.086136 systemd-logind[1591]: Removed session 21.
Nov 24 06:58:22.140673 sshd[5427]: Accepted publickey for core from 147.75.109.163 port 59620 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:22.141734 sshd-session[5427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:22.144407 systemd-logind[1591]: New session 22 of user core.
Nov 24 06:58:22.151849 systemd[1]: Started session-22.scope - Session 22 of User core.
Nov 24 06:58:22.247310 sshd[5430]: Connection closed by 147.75.109.163 port 59620
Nov 24 06:58:22.247665 sshd-session[5427]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:22.250892 systemd[1]: sshd@19-139.178.70.102:22-147.75.109.163:59620.service: Deactivated successfully.
Nov 24 06:58:22.252478 systemd[1]: session-22.scope: Deactivated successfully.
Nov 24 06:58:22.253645 systemd-logind[1591]: Session 22 logged out. Waiting for processes to exit.
Nov 24 06:58:22.254253 systemd-logind[1591]: Removed session 22.
Nov 24 06:58:23.106441 kubelet[2932]: E1124 06:58:23.106413 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d"
Nov 24 06:58:24.105773 containerd[1621]: time="2025-11-24T06:58:24.105750252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Nov 24 06:58:24.436361 containerd[1621]: time="2025-11-24T06:58:24.436278512Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:24.436682 containerd[1621]: time="2025-11-24T06:58:24.436662306Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Nov 24 06:58:24.436750 containerd[1621]: time="2025-11-24T06:58:24.436719824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Nov 24 06:58:24.436889 kubelet[2932]: E1124 06:58:24.436854 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:24.437081 kubelet[2932]: E1124 06:58:24.436898 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:24.437081 kubelet[2932]: E1124 06:58:24.437021 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kszgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-csbwd_calico-apiserver(56a1253d-b0f7-4032-98a1-7eca8d8f6d62): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:24.438334 kubelet[2932]: E1124 06:58:24.438300 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62"
Nov 24 06:58:27.258911 systemd[1]: Started sshd@20-139.178.70.102:22-147.75.109.163:59626.service - OpenSSH per-connection server daemon (147.75.109.163:59626).
Nov 24 06:58:27.364519 sshd[5451]: Accepted publickey for core from 147.75.109.163 port 59626 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:27.366590 sshd-session[5451]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:27.371011 systemd-logind[1591]: New session 23 of user core.
Nov 24 06:58:27.373767 systemd[1]: Started session-23.scope - Session 23 of User core.
Nov 24 06:58:27.637100 sshd[5454]: Connection closed by 147.75.109.163 port 59626
Nov 24 06:58:27.637728 sshd-session[5451]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:27.641364 systemd-logind[1591]: Session 23 logged out. Waiting for processes to exit.
Nov 24 06:58:27.641538 systemd[1]: sshd@20-139.178.70.102:22-147.75.109.163:59626.service: Deactivated successfully.
Nov 24 06:58:27.643400 systemd[1]: session-23.scope: Deactivated successfully.
Nov 24 06:58:27.647068 systemd-logind[1591]: Removed session 23.
Nov 24 06:58:28.105468 kubelet[2932]: E1124 06:58:28.105440 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5"
Nov 24 06:58:28.106005 containerd[1621]: time="2025-11-24T06:58:28.105783082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Nov 24 06:58:28.443591 containerd[1621]: time="2025-11-24T06:58:28.443432182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:28.444240 containerd[1621]: time="2025-11-24T06:58:28.444204457Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Nov 24 06:58:28.444370 containerd[1621]: time="2025-11-24T06:58:28.444283261Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69"
Nov 24 06:58:28.444731 kubelet[2932]: E1124 06:58:28.444522 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Nov 24 06:58:28.444731 kubelet[2932]: E1124 06:58:28.444566 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Nov 24 06:58:28.444958 kubelet[2932]: E1124 06:58:28.444917 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:28.446864 containerd[1621]: time="2025-11-24T06:58:28.446837473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Nov 24 06:58:28.803859 containerd[1621]: time="2025-11-24T06:58:28.803723756Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:28.808872 containerd[1621]: time="2025-11-24T06:58:28.808850043Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Nov 24 06:58:28.809001 containerd[1621]: time="2025-11-24T06:58:28.808916821Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Nov 24 06:58:28.809269 kubelet[2932]: E1124 06:58:28.809112 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Nov 24 06:58:28.809269 kubelet[2932]: E1124 06:58:28.809151 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Nov 24 06:58:28.809269 kubelet[2932]: E1124 06:58:28.809240 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqh2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hsxdh_calico-system(ec589a89-1333-4d00-aa6a-417830a62536): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:28.810508 kubelet[2932]: E1124 06:58:28.810484 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hsxdh" podUID="ec589a89-1333-4d00-aa6a-417830a62536"
Nov 24 06:58:30.106006 containerd[1621]: time="2025-11-24T06:58:30.105697302Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Nov 24 06:58:30.415456 containerd[1621]: time="2025-11-24T06:58:30.415048773Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:30.423770 containerd[1621]: time="2025-11-24T06:58:30.423694214Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Nov 24 06:58:30.423770 containerd[1621]: time="2025-11-24T06:58:30.423749665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Nov 24 06:58:30.423924 kubelet[2932]: E1124 06:58:30.423849 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:30.423924 kubelet[2932]: E1124 06:58:30.423883 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:30.424248 kubelet[2932]: E1124 06:58:30.423972 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9ck7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6757d5779b-2xfwn_calico-apiserver(8655a062-8ee8-4565-9af9-1c36ab263987): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:30.425338 kubelet[2932]: E1124 06:58:30.425301 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6757d5779b-2xfwn" podUID="8655a062-8ee8-4565-9af9-1c36ab263987"
Nov 24 06:58:32.106511 containerd[1621]: time="2025-11-24T06:58:32.106442189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Nov 24 06:58:32.606768 containerd[1621]: time="2025-11-24T06:58:32.606735864Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:32.607129 containerd[1621]: time="2025-11-24T06:58:32.607107925Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Nov 24 06:58:32.607202 containerd[1621]: time="2025-11-24T06:58:32.607186775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Nov 24 06:58:32.607378 kubelet[2932]: E1124 06:58:32.607350 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Nov 24 06:58:32.607590 kubelet[2932]: E1124 06:58:32.607386 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Nov 24 06:58:32.607590 kubelet[2932]: E1124 06:58:32.607470 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:f83afd19fc7845bbac116113552ede1f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:32.610193 containerd[1621]: time="2025-11-24T06:58:32.610167144Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Nov 24 06:58:32.648145 systemd[1]: Started sshd@21-139.178.70.102:22-147.75.109.163:47302.service - OpenSSH per-connection server daemon (147.75.109.163:47302).
Nov 24 06:58:32.812820 sshd[5470]: Accepted publickey for core from 147.75.109.163 port 47302 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:32.813839 sshd-session[5470]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:32.819525 systemd-logind[1591]: New session 24 of user core.
Nov 24 06:58:32.823760 systemd[1]: Started session-24.scope - Session 24 of User core.
Nov 24 06:58:32.933914 containerd[1621]: time="2025-11-24T06:58:32.933724770Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:32.936813 containerd[1621]: time="2025-11-24T06:58:32.936500879Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Nov 24 06:58:32.936813 containerd[1621]: time="2025-11-24T06:58:32.936526911Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Nov 24 06:58:32.937065 kubelet[2932]: E1124 06:58:32.937031 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Nov 24 06:58:32.937118 kubelet[2932]: E1124 06:58:32.937067 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Nov 24 06:58:32.937189 kubelet[2932]: E1124 06:58:32.937157 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8x6zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d558dc64d-9gf9z_calico-system(d262c589-dcbf-4568-b396-12186ab1a67f): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:32.939725 kubelet[2932]: E1124 06:58:32.939685 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d558dc64d-9gf9z" podUID="d262c589-dcbf-4568-b396-12186ab1a67f"
Nov 24 06:58:33.005719 sshd[5473]: Connection closed by 147.75.109.163 port 47302
Nov 24 06:58:33.008197 systemd[1]: sshd@21-139.178.70.102:22-147.75.109.163:47302.service: Deactivated successfully.
Nov 24 06:58:33.006156 sshd-session[5470]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:33.009273 systemd[1]: session-24.scope: Deactivated successfully.
Nov 24 06:58:33.009817 systemd-logind[1591]: Session 24 logged out. Waiting for processes to exit.
Nov 24 06:58:33.010938 systemd-logind[1591]: Removed session 24.
Nov 24 06:58:34.105003 containerd[1621]: time="2025-11-24T06:58:34.104902330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Nov 24 06:58:34.464501 containerd[1621]: time="2025-11-24T06:58:34.464293106Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:34.466785 containerd[1621]: time="2025-11-24T06:58:34.466764956Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Nov 24 06:58:34.466834 containerd[1621]: time="2025-11-24T06:58:34.466817815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Nov 24 06:58:34.466933 kubelet[2932]: E1124 06:58:34.466906 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:34.467160 kubelet[2932]: E1124 06:58:34.466940 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Nov 24 06:58:34.467160 kubelet[2932]: E1124 06:58:34.467028 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74s7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7c65846d8b-n55vh_calico-apiserver(77c9c1c2-ff92-4d21-b427-835b49d2e048): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:34.468190 kubelet[2932]: E1124 06:58:34.468155 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-n55vh" podUID="77c9c1c2-ff92-4d21-b427-835b49d2e048"
Nov 24 06:58:36.104633 kubelet[2932]: E1124 06:58:36.104597 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7c65846d8b-csbwd" podUID="56a1253d-b0f7-4032-98a1-7eca8d8f6d62"
Nov 24 06:58:38.017293 systemd[1]: Started sshd@22-139.178.70.102:22-147.75.109.163:47316.service - OpenSSH per-connection server daemon (147.75.109.163:47316).
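Every pull attempt in this log fails the same way: containerd gets a 404 from ghcr.io and kubelet surfaces it as ErrImagePull. When triaging a capture like this, a first step is to list the distinct image references that failed. A minimal sketch of such a log scraper; the regex is an assumption keyed to the "failed to pull and unpack image" wording in the entries above, not an official parser:

```python
import re

# Matches the image reference inside containerd/kubelet pull errors.
# The quote may appear escaped (\") inside kubelet err="..." fields,
# so the backslash is optional. This pattern is an assumption based on
# the log wording, not a documented format.
PULL_ERR = re.compile(r'failed to pull and unpack image \\?"(?P<image>[^"\\]+)\\?"')

def failed_images(log_text: str) -> list[str]:
    """Return the distinct image references that failed to pull, in order seen."""
    seen: list[str] = []
    for match in PULL_ERR.finditer(log_text):
        image = match.group("image")
        if image not in seen:
            seen.append(image)
    return seen
```

Run over this excerpt, it would reduce the noise to the five failing references: the whisker, whisker-backend, apiserver, goldmane, and kube-controllers images, all at tag v3.30.4.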
Nov 24 06:58:38.105245 containerd[1621]: time="2025-11-24T06:58:38.105215952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Nov 24 06:58:38.107259 sshd[5513]: Accepted publickey for core from 147.75.109.163 port 47316 ssh2: RSA SHA256:RwhYujxUqnPbIXvpU/VEdpztLb6ySvEnhik5HMZEY24
Nov 24 06:58:38.108738 sshd-session[5513]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Nov 24 06:58:38.111774 systemd-logind[1591]: New session 25 of user core.
Nov 24 06:58:38.116727 systemd[1]: Started session-25.scope - Session 25 of User core.
Nov 24 06:58:38.433764 sshd[5516]: Connection closed by 147.75.109.163 port 47316
Nov 24 06:58:38.434110 sshd-session[5513]: pam_unix(sshd:session): session closed for user core
Nov 24 06:58:38.436957 systemd[1]: sshd@22-139.178.70.102:22-147.75.109.163:47316.service: Deactivated successfully.
Nov 24 06:58:38.437940 systemd[1]: session-25.scope: Deactivated successfully.
Nov 24 06:58:38.439771 systemd-logind[1591]: Session 25 logged out. Waiting for processes to exit.
Nov 24 06:58:38.441897 systemd-logind[1591]: Removed session 25.
Nov 24 06:58:38.444723 containerd[1621]: time="2025-11-24T06:58:38.444700746Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:38.444982 containerd[1621]: time="2025-11-24T06:58:38.444964218Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Nov 24 06:58:38.445022 containerd[1621]: time="2025-11-24T06:58:38.445010357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Nov 24 06:58:38.445124 kubelet[2932]: E1124 06:58:38.445099 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Nov 24 06:58:38.445311 kubelet[2932]: E1124 06:58:38.445132 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Nov 24 06:58:38.445311 kubelet[2932]: E1124 06:58:38.445225 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wctht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-swsqx_calico-system(3079118d-9876-4056-a671-92f88f5f8c3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:38.446566 kubelet[2932]: E1124 06:58:38.446544 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-swsqx" podUID="3079118d-9876-4056-a671-92f88f5f8c3d"
Nov 24 06:58:40.105422 containerd[1621]: time="2025-11-24T06:58:40.105358842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Nov 24 06:58:40.421288 containerd[1621]: time="2025-11-24T06:58:40.421202222Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Nov 24 06:58:40.421564 containerd[1621]: time="2025-11-24T06:58:40.421535872Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Nov 24 06:58:40.421798 containerd[1621]: time="2025-11-24T06:58:40.421607164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Nov 24 06:58:40.421836 kubelet[2932]: E1124 06:58:40.421743 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Nov 24 06:58:40.421836 kubelet[2932]: E1124 06:58:40.421773 2932 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Nov 24 06:58:40.422030 kubelet[2932]: E1124 06:58:40.421887 2932 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzjhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79b4855f45-7htjs_calico-system(e9872522-30cd-4303-b0f4-9d477ec17bc5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Nov 24 06:58:40.424630 kubelet[2932]: E1124 06:58:40.423187 2932 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79b4855f45-7htjs" podUID="e9872522-30cd-4303-b0f4-9d477ec17bc5"
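The "Back-off pulling image" / ImagePullBackOff entries interleaved with these failures reflect kubelet's per-image retry backoff, which roughly doubles the wait between pull attempts up to a cap. A sketch of that schedule, assuming kubelet's commonly documented defaults of a 10-second initial delay and a 5-minute ceiling (these constants are not recorded in this log):

```python
def backoff_delays(initial: int = 10, cap: int = 300, attempts: int = 7) -> list[int]:
    """Seconds kubelet would wait before each retry: doubling, clamped at the cap.

    initial/cap mirror kubelet's assumed image-pull backoff defaults (10s, 300s).
    """
    delays, delay = [], initial
    for _ in range(attempts):
        delays.append(min(delay, cap))
        delay *= 2
    return delays
```

With the defaults this yields 10, 20, 40, 80, 160 seconds and then steady 300-second retries, which is why the pull errors above recur every few minutes rather than continuously. Since the registry returns 404 (the tag genuinely does not exist), no amount of backoff resolves it; the fix is publishing or correcting the v3.30.4 image references.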