Dec 12 18:34:15.708969 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 12 18:34:15.708985 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 18:34:15.708992 kernel: Disabled fast string operations
Dec 12 18:34:15.708996 kernel: BIOS-provided physical RAM map:
Dec 12 18:34:15.709006 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Dec 12 18:34:15.709013 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Dec 12 18:34:15.709018 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Dec 12 18:34:15.709025 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Dec 12 18:34:15.709029 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Dec 12 18:34:15.709033 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Dec 12 18:34:15.709037 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Dec 12 18:34:15.709042 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Dec 12 18:34:15.709046 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Dec 12 18:34:15.709050 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Dec 12 18:34:15.709057 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Dec 12 18:34:15.709062 kernel: NX (Execute Disable) protection: active
Dec 12 18:34:15.709067 kernel: APIC: Static calls initialized
Dec 12 18:34:15.709072 kernel: SMBIOS 2.7 present.
Dec 12 18:34:15.709077 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Dec 12 18:34:15.709082 kernel: DMI: Memory slots populated: 1/128
Dec 12 18:34:15.709086 kernel: vmware: hypercall mode: 0x00
Dec 12 18:34:15.709091 kernel: Hypervisor detected: VMware
Dec 12 18:34:15.709096 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Dec 12 18:34:15.709102 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Dec 12 18:34:15.709107 kernel: vmware: using clock offset of 4248220551 ns
Dec 12 18:34:15.709111 kernel: tsc: Detected 3408.000 MHz processor
Dec 12 18:34:15.709117 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 12 18:34:15.709125 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 12 18:34:15.709131 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Dec 12 18:34:15.709136 kernel: total RAM covered: 3072M
Dec 12 18:34:15.709140 kernel: Found optimal setting for mtrr clean up
Dec 12 18:34:15.709146 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Dec 12 18:34:15.709151 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Dec 12 18:34:15.709158 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 12 18:34:15.709163 kernel: Using GB pages for direct mapping
Dec 12 18:34:15.709168 kernel: ACPI: Early table checksum verification disabled
Dec 12 18:34:15.709175 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Dec 12 18:34:15.709180 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Dec 12 18:34:15.709185 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Dec 12 18:34:15.709190 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Dec 12 18:34:15.709197 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Dec 12 18:34:15.709203 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Dec 12 18:34:15.709208 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Dec 12 18:34:15.709213 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Dec 12 18:34:15.709218 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Dec 12 18:34:15.709224 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Dec 12 18:34:15.709229 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Dec 12 18:34:15.709235 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Dec 12 18:34:15.709240 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Dec 12 18:34:15.709246 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Dec 12 18:34:15.709251 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Dec 12 18:34:15.709256 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Dec 12 18:34:15.709261 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Dec 12 18:34:15.709266 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Dec 12 18:34:15.709272 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Dec 12 18:34:15.709277 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Dec 12 18:34:15.709283 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Dec 12 18:34:15.709288 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Dec 12 18:34:15.709293 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Dec 12 18:34:15.709298 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Dec 12 18:34:15.709303 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Dec 12 18:34:15.709309 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
Dec 12 18:34:15.709314 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
Dec 12 18:34:15.709323 kernel: Zone ranges:
Dec 12 18:34:15.709329 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 12 18:34:15.709334 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Dec 12 18:34:15.709340 kernel: Normal empty
Dec 12 18:34:15.709345 kernel: Device empty
Dec 12 18:34:15.709350 kernel: Movable zone start for each node
Dec 12 18:34:15.709355 kernel: Early memory node ranges
Dec 12 18:34:15.709360 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Dec 12 18:34:15.709365 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Dec 12 18:34:15.709371 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Dec 12 18:34:15.709376 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Dec 12 18:34:15.709381 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 12 18:34:15.709387 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Dec 12 18:34:15.709392 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Dec 12 18:34:15.709397 kernel: ACPI: PM-Timer IO Port: 0x1008
Dec 12 18:34:15.709402 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Dec 12 18:34:15.709407 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Dec 12 18:34:15.709413 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Dec 12 18:34:15.709418 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Dec 12 18:34:15.709423 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Dec 12 18:34:15.709428 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Dec 12 18:34:15.709433 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Dec 12 18:34:15.709439 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Dec 12 18:34:15.709444 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Dec 12 18:34:15.709449 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Dec 12 18:34:15.709454 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Dec 12 18:34:15.709459 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Dec 12 18:34:15.709465 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Dec 12 18:34:15.709470 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Dec 12 18:34:15.709474 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Dec 12 18:34:15.709480 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Dec 12 18:34:15.709485 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Dec 12 18:34:15.709491 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Dec 12 18:34:15.709496 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Dec 12 18:34:15.709501 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Dec 12 18:34:15.709506 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Dec 12 18:34:15.709512 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Dec 12 18:34:15.709517 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Dec 12 18:34:15.709522 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Dec 12 18:34:15.709527 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Dec 12 18:34:15.709532 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Dec 12 18:34:15.709538 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Dec 12 18:34:15.709543 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Dec 12 18:34:15.709548 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Dec 12 18:34:15.709624 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Dec 12 18:34:15.709630 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Dec 12 18:34:15.709635 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Dec 12 18:34:15.709640 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Dec 12 18:34:15.709646 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Dec 12 18:34:15.709651 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Dec 12 18:34:15.709656 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Dec 12 18:34:15.709663 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Dec 12 18:34:15.709668 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Dec 12 18:34:15.709673 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Dec 12 18:34:15.709678 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Dec 12 18:34:15.709687 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Dec 12 18:34:15.709693 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Dec 12 18:34:15.709699 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Dec 12 18:34:15.709704 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Dec 12 18:34:15.709711 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Dec 12 18:34:15.709716 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Dec 12 18:34:15.709722 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Dec 12 18:34:15.709727 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Dec 12 18:34:15.709732 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Dec 12 18:34:15.709738 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Dec 12 18:34:15.709743 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Dec 12 18:34:15.709749 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Dec 12 18:34:15.709754 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Dec 12 18:34:15.709760 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Dec 12 18:34:15.709766 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Dec 12 18:34:15.709771 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Dec 12 18:34:15.709777 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Dec 12 18:34:15.709782 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Dec 12 18:34:15.709788 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Dec 12 18:34:15.709793 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Dec 12 18:34:15.709799 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Dec 12 18:34:15.709804 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Dec 12 18:34:15.709810 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Dec 12 18:34:15.709815 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Dec 12 18:34:15.709822 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Dec 12 18:34:15.709827 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Dec 12 18:34:15.709833 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Dec 12 18:34:15.709838 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Dec 12 18:34:15.709843 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Dec 12 18:34:15.709849 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Dec 12 18:34:15.709854 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Dec 12 18:34:15.709859 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Dec 12 18:34:15.709865 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Dec 12 18:34:15.709870 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Dec 12 18:34:15.709877 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Dec 12 18:34:15.709882 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Dec 12 18:34:15.709887 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Dec 12 18:34:15.709893 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Dec 12 18:34:15.709898 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Dec 12 18:34:15.709904 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Dec 12 18:34:15.709909 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Dec 12 18:34:15.709914 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Dec 12 18:34:15.709920 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Dec 12 18:34:15.709925 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Dec 12 18:34:15.709932 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Dec 12 18:34:15.709937 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Dec 12 18:34:15.709943 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Dec 12 18:34:15.709948 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Dec 12 18:34:15.709954 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Dec 12 18:34:15.709959 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Dec 12 18:34:15.709965 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Dec 12 18:34:15.709970 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Dec 12 18:34:15.709975 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Dec 12 18:34:15.709982 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Dec 12 18:34:15.709987 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Dec 12 18:34:15.709993 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Dec 12 18:34:15.709998 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Dec 12 18:34:15.710004 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Dec 12 18:34:15.710009 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Dec 12 18:34:15.710014 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Dec 12 18:34:15.710020 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Dec 12 18:34:15.710025 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Dec 12 18:34:15.710031 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Dec 12 18:34:15.710037 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Dec 12 18:34:15.710043 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Dec 12 18:34:15.710048 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Dec 12 18:34:15.710053 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Dec 12 18:34:15.710059 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Dec 12 18:34:15.710064 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Dec 12 18:34:15.710070 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Dec 12 18:34:15.710075 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Dec 12 18:34:15.710081 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Dec 12 18:34:15.710086 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Dec 12 18:34:15.710093 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Dec 12 18:34:15.710098 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Dec 12 18:34:15.710104 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Dec 12 18:34:15.710109 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Dec 12 18:34:15.710115 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Dec 12 18:34:15.710120 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Dec 12 18:34:15.710126 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Dec 12 18:34:15.710131 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Dec 12 18:34:15.710136 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Dec 12 18:34:15.710142 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Dec 12 18:34:15.710148 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Dec 12 18:34:15.710154 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Dec 12 18:34:15.710159 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Dec 12 18:34:15.710164 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Dec 12 18:34:15.710170 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Dec 12 18:34:15.710175 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Dec 12 18:34:15.710181 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Dec 12 18:34:15.710186 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 12 18:34:15.710192 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Dec 12 18:34:15.710198 kernel: TSC deadline timer available
Dec 12 18:34:15.710204 kernel: CPU topo: Max. logical packages: 128
Dec 12 18:34:15.710209 kernel: CPU topo: Max. logical dies: 128
Dec 12 18:34:15.710215 kernel: CPU topo: Max. dies per package: 1
Dec 12 18:34:15.710220 kernel: CPU topo: Max. threads per core: 1
Dec 12 18:34:15.710225 kernel: CPU topo: Num. cores per package: 1
Dec 12 18:34:15.710231 kernel: CPU topo: Num. threads per package: 1
Dec 12 18:34:15.710236 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
Dec 12 18:34:15.710242 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Dec 12 18:34:15.710247 kernel: Booting paravirtualized kernel on VMware hypervisor
Dec 12 18:34:15.710254 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 12 18:34:15.710259 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Dec 12 18:34:15.710265 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Dec 12 18:34:15.710271 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Dec 12 18:34:15.710276 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Dec 12 18:34:15.710281 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Dec 12 18:34:15.710287 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Dec 12 18:34:15.710292 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Dec 12 18:34:15.710298 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Dec 12 18:34:15.710304 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Dec 12 18:34:15.710309 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Dec 12 18:34:15.710315 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Dec 12 18:34:15.710320 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Dec 12 18:34:15.710325 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Dec 12 18:34:15.710332 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Dec 12 18:34:15.710340 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Dec 12 18:34:15.710345 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Dec 12 18:34:15.710352 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Dec 12 18:34:15.710357 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Dec 12 18:34:15.710362 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Dec 12 18:34:15.710369 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 18:34:15.710374 kernel: random: crng init done
Dec 12 18:34:15.710380 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Dec 12 18:34:15.710385 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Dec 12 18:34:15.710391 kernel: printk: log_buf_len min size: 262144 bytes
Dec 12 18:34:15.710396 kernel: printk: log_buf_len: 1048576 bytes
Dec 12 18:34:15.710403 kernel: printk: early log buf free: 245704(93%)
Dec 12 18:34:15.710408 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 18:34:15.710414 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 12 18:34:15.710419 kernel: Fallback order for Node 0: 0
Dec 12 18:34:15.710425 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
Dec 12 18:34:15.710431 kernel: Policy zone: DMA32
Dec 12 18:34:15.710436 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 18:34:15.710442 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Dec 12 18:34:15.710447 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 12 18:34:15.710454 kernel: ftrace: allocated 157 pages with 5 groups
Dec 12 18:34:15.710459 kernel: Dynamic Preempt: voluntary
Dec 12 18:34:15.710465 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 18:34:15.710470 kernel: rcu: RCU event tracing is enabled.
Dec 12 18:34:15.710476 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Dec 12 18:34:15.710482 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 18:34:15.710487 kernel: Rude variant of Tasks RCU enabled.
Dec 12 18:34:15.710493 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 18:34:15.710498 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 18:34:15.710505 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Dec 12 18:34:15.710510 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Dec 12 18:34:15.710516 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Dec 12 18:34:15.710522 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Dec 12 18:34:15.710527 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Dec 12 18:34:15.710533 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Dec 12 18:34:15.710538 kernel: Console: colour VGA+ 80x25
Dec 12 18:34:15.710544 kernel: printk: legacy console [tty0] enabled
Dec 12 18:34:15.710549 kernel: printk: legacy console [ttyS0] enabled
Dec 12 18:34:15.710570 kernel: ACPI: Core revision 20240827
Dec 12 18:34:15.710576 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Dec 12 18:34:15.710582 kernel: APIC: Switch to symmetric I/O mode setup
Dec 12 18:34:15.710587 kernel: x2apic enabled
Dec 12 18:34:15.710593 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 12 18:34:15.710599 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Dec 12 18:34:15.710604 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Dec 12 18:34:15.710610 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Dec 12 18:34:15.710615 kernel: Disabled fast string operations
Dec 12 18:34:15.710621 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Dec 12 18:34:15.710628 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Dec 12 18:34:15.710633 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 12 18:34:15.710639 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Dec 12 18:34:15.710644 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Dec 12 18:34:15.710650 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Dec 12 18:34:15.710656 kernel: RETBleed: Mitigation: Enhanced IBRS
Dec 12 18:34:15.710661 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 12 18:34:15.710667 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 12 18:34:15.710673 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Dec 12 18:34:15.710679 kernel: SRBDS: Unknown: Dependent on hypervisor status
Dec 12 18:34:15.710685 kernel: GDS: Unknown: Dependent on hypervisor status
Dec 12 18:34:15.710690 kernel: active return thunk: its_return_thunk
Dec 12 18:34:15.710696 kernel: ITS: Mitigation: Aligned branch/return thunks
Dec 12 18:34:15.710701 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 12 18:34:15.710706 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 12 18:34:15.710712 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 12 18:34:15.710717 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 12 18:34:15.710724 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 12 18:34:15.710730 kernel: Freeing SMP alternatives memory: 32K
Dec 12 18:34:15.710735 kernel: pid_max: default: 131072 minimum: 1024
Dec 12 18:34:15.710741 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 18:34:15.710746 kernel: landlock: Up and running.
Dec 12 18:34:15.710752 kernel: SELinux: Initializing.
Dec 12 18:34:15.710758 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 12 18:34:15.710763 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 12 18:34:15.710769 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Dec 12 18:34:15.710776 kernel: Performance Events: Skylake events, core PMU driver.
Dec 12 18:34:15.710781 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Dec 12 18:34:15.710787 kernel: core: CPUID marked event: 'instructions' unavailable
Dec 12 18:34:15.710792 kernel: core: CPUID marked event: 'bus cycles' unavailable
Dec 12 18:34:15.710798 kernel: core: CPUID marked event: 'cache references' unavailable
Dec 12 18:34:15.710803 kernel: core: CPUID marked event: 'cache misses' unavailable
Dec 12 18:34:15.710808 kernel: core: CPUID marked event: 'branch instructions' unavailable
Dec 12 18:34:15.710814 kernel: core: CPUID marked event: 'branch misses' unavailable
Dec 12 18:34:15.710819 kernel: ... version: 1
Dec 12 18:34:15.710826 kernel: ... bit width: 48
Dec 12 18:34:15.710831 kernel: ... generic registers: 4
Dec 12 18:34:15.710837 kernel: ... value mask: 0000ffffffffffff
Dec 12 18:34:15.710842 kernel: ... max period: 000000007fffffff
Dec 12 18:34:15.710848 kernel: ... fixed-purpose events: 0
Dec 12 18:34:15.710853 kernel: ... event mask: 000000000000000f
Dec 12 18:34:15.710859 kernel: signal: max sigframe size: 1776
Dec 12 18:34:15.710864 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 18:34:15.710871 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 18:34:15.710877 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
Dec 12 18:34:15.710883 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Dec 12 18:34:15.710888 kernel: smp: Bringing up secondary CPUs ...
Dec 12 18:34:15.710894 kernel: smpboot: x86: Booting SMP configuration:
Dec 12 18:34:15.710899 kernel: .... node #0, CPUs: #1
Dec 12 18:34:15.710905 kernel: Disabled fast string operations
Dec 12 18:34:15.710910 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 18:34:15.710915 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Dec 12 18:34:15.710921 kernel: Memory: 1916068K/2096628K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 169180K reserved, 0K cma-reserved)
Dec 12 18:34:15.710928 kernel: devtmpfs: initialized
Dec 12 18:34:15.710934 kernel: x86/mm: Memory block size: 128MB
Dec 12 18:34:15.710939 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Dec 12 18:34:15.710945 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 18:34:15.710950 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Dec 12 18:34:15.710956 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 18:34:15.710961 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 18:34:15.710967 kernel: audit: initializing netlink subsys (disabled)
Dec 12 18:34:15.710973 kernel: audit: type=2000 audit(1765564452.288:1): state=initialized audit_enabled=0 res=1
Dec 12 18:34:15.710979 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 18:34:15.710985 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 12 18:34:15.710990 kernel: cpuidle: using governor menu
Dec 12 18:34:15.710996 kernel: Simple Boot Flag at 0x36 set to 0x80
Dec 12 18:34:15.711001 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 18:34:15.711007 kernel: dca service started, version 1.12.1
Dec 12 18:34:15.711013 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
Dec 12 18:34:15.711025 kernel: PCI: Using configuration type 1 for base access
Dec 12 18:34:15.711032 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 12 18:34:15.711039 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 18:34:15.711045 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 18:34:15.711051 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 18:34:15.711057 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 18:34:15.711062 kernel: ACPI: Added _OSI(Module Device)
Dec 12 18:34:15.711068 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 18:34:15.711074 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 18:34:15.711080 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 18:34:15.711085 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Dec 12 18:34:15.711092 kernel: ACPI: Interpreter enabled
Dec 12 18:34:15.711098 kernel: ACPI: PM: (supports S0 S1 S5)
Dec 12 18:34:15.711104 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 12 18:34:15.711109 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 12 18:34:15.711115 kernel: PCI: Using E820 reservations for host bridge windows
Dec 12 18:34:15.711121 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Dec 12 18:34:15.711127 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Dec 12 18:34:15.711208 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 18:34:15.711263 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Dec 12 18:34:15.711312 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Dec 12 18:34:15.711320 kernel: PCI host bridge to bus 0000:00
Dec 12 18:34:15.711371 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 12 18:34:15.711416 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Dec 12 18:34:15.711459 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 12 18:34:15.711501 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 12 18:34:15.711546 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Dec 12 18:34:15.711615 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Dec 12 18:34:15.711691 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
Dec 12 18:34:15.711752 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
Dec 12 18:34:15.711803 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 18:34:15.711859 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Dec 12 18:34:15.711913 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
Dec 12 18:34:15.711963 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
Dec 12 18:34:15.712011 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
Dec 12 18:34:15.712060 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
Dec 12 18:34:15.712111 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
Dec 12 18:34:15.712160 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
Dec 12 18:34:15.712224 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 12 18:34:15.712274 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Dec 12 18:34:15.713187 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Dec 12 18:34:15.713277 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
Dec 12 18:34:15.713375 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
Dec 12 18:34:15.713433 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
Dec 12 18:34:15.713488 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
Dec 12 18:34:15.713539 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
Dec 12 18:34:15.713606 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
Dec 12 18:34:15.713672 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
Dec 12 18:34:15.713728 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
Dec 12 18:34:15.713780 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 12 18:34:15.713833 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
Dec 12 18:34:15.713883 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Dec 12 18:34:15.713932 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Dec 12 18:34:15.713980 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Dec 12 18:34:15.714029 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Dec 12 18:34:15.714083 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.714136 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Dec 12 18:34:15.714184 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Dec 12 18:34:15.714234 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Dec 12 18:34:15.714283 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.714343 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.714395 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Dec 12 18:34:15.714444 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Dec 12 18:34:15.714496 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Dec 12 18:34:15.714546 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Dec 12 18:34:15.714616 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.714670 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.714721 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Dec 12 18:34:15.714770 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Dec 12 18:34:15.714823 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Dec 12 18:34:15.714872 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Dec 12 18:34:15.714921 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.714975 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.715025 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Dec 12 18:34:15.715075 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Dec 12 18:34:15.715124 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Dec 12 18:34:15.715176 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.715230 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.715279 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Dec 12 18:34:15.715328 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Dec 12 18:34:15.715377 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Dec 12 18:34:15.715426 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.715481 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.715533 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Dec 12 18:34:15.715602 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Dec 12 18:34:15.715654 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Dec 12 18:34:15.715704 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.715758 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.715810 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Dec 12 18:34:15.715860 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Dec 12 18:34:15.715912 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Dec 12 18:34:15.715961 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.716014 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.716064 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Dec 12 18:34:15.716113 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Dec 12 18:34:15.716162 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Dec 12 18:34:15.716211 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.716264 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.716317 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Dec 12 18:34:15.716379 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Dec 12 18:34:15.716429 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Dec 12 18:34:15.716477 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.716533 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.716598 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Dec 12 18:34:15.716649 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Dec 12 18:34:15.716707 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Dec 12 18:34:15.716760 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Dec 12 18:34:15.718601 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.718666 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.718721 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Dec 12 18:34:15.718773 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Dec 12 18:34:15.718823 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Dec 12 18:34:15.718876 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Dec 12 18:34:15.718925 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.718979 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.719030 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Dec 12 18:34:15.719079 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Dec 12 18:34:15.719129 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Dec 12 18:34:15.719179 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.719235 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.719285 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Dec 12 18:34:15.719338 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Dec 12 18:34:15.719388 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Dec 12 18:34:15.719438 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.719492 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.719541 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Dec 12 18:34:15.719609 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Dec 12 18:34:15.719659 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Dec 12 18:34:15.719708 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.719764 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.719815 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Dec 12 18:34:15.719864 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Dec 12 18:34:15.719912 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Dec 12 18:34:15.719965 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.720019 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.720069 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Dec 12 18:34:15.720118 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Dec 12 18:34:15.720167 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Dec 12 18:34:15.720216 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.720269 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.720322 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Dec 12 18:34:15.720371 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Dec 12 18:34:15.720421 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Dec 12 18:34:15.720469 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Dec 12 18:34:15.720521 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.721759 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.721820 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Dec 12 18:34:15.721878 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Dec 12 18:34:15.721930 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Dec 12 18:34:15.721981 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Dec 12 18:34:15.722031 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.722091 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.722142 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Dec 12 18:34:15.722191 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Dec 12 18:34:15.722241 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Dec 12 18:34:15.722290 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Dec 12 18:34:15.722343 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.722398 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.722450 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Dec 12 18:34:15.722500 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Dec 12 18:34:15.722549 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Dec 12 18:34:15.722622 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.722676 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.722727 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Dec 12 18:34:15.722801 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Dec 12 18:34:15.722860 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Dec 12 18:34:15.722946 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.723005 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.723055 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Dec 12 18:34:15.723105 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Dec 12 18:34:15.723155 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Dec 12 18:34:15.723203 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.723258 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.723311 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Dec 12 18:34:15.723361 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Dec 12 18:34:15.723411 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Dec 12 18:34:15.723460 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.723513 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.723586 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Dec 12 18:34:15.723638 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Dec 12 18:34:15.723690 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Dec 12 18:34:15.723740 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.723794 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.723844 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Dec 12 18:34:15.723893 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Dec 12 18:34:15.723941 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Dec 12 18:34:15.723990 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Dec 12 18:34:15.724043 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.724099 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.724149 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Dec 12 18:34:15.724198 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Dec 12 18:34:15.724247 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Dec 12 18:34:15.724296 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Dec 12 18:34:15.724345 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.724404 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.724459 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Dec 12 18:34:15.724517 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Dec 12 18:34:15.724591 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Dec 12 18:34:15.724649 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.724706 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.724756 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Dec 12 18:34:15.724808 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Dec 12 18:34:15.724858 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Dec 12 18:34:15.724906 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.724960 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.725011 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Dec 12 18:34:15.725060 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Dec 12 18:34:15.725109 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Dec 12 18:34:15.725161 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.725215 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.725265 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Dec 12 18:34:15.725315 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Dec 12 18:34:15.725369 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Dec 12 18:34:15.725420 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.725474 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.725527 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Dec 12 18:34:15.725609 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Dec 12 18:34:15.725661 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Dec 12 18:34:15.725710 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.725764 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
Dec 12 18:34:15.725813 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Dec 12 18:34:15.725862 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Dec 12 18:34:15.725914 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Dec 12 18:34:15.725963 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.726019 kernel: pci_bus 0000:01: extended config space not accessible
Dec 12 18:34:15.726071 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 12 18:34:15.726125 kernel: pci_bus 0000:02: extended config space not accessible
Dec 12 18:34:15.726134 kernel: acpiphp: Slot [32] registered
Dec 12 18:34:15.726141 kernel: acpiphp: Slot [33] registered
Dec 12 18:34:15.726147 kernel: acpiphp: Slot [34] registered
Dec 12 18:34:15.726155 kernel: acpiphp: Slot [35] registered
Dec 12 18:34:15.726160 kernel: acpiphp: Slot [36] registered
Dec 12 18:34:15.726166 kernel: acpiphp: Slot [37] registered
Dec 12 18:34:15.726172 kernel: acpiphp: Slot [38] registered
Dec 12 18:34:15.726178 kernel: acpiphp: Slot [39] registered
Dec 12 18:34:15.726184 kernel: acpiphp: Slot [40] registered
Dec 12 18:34:15.726190 kernel: acpiphp: Slot [41] registered
Dec 12 18:34:15.726195 kernel: acpiphp: Slot [42] registered
Dec 12 18:34:15.726201 kernel: acpiphp: Slot [43] registered
Dec 12 18:34:15.726207 kernel: acpiphp: Slot [44] registered
Dec 12 18:34:15.726214 kernel: acpiphp: Slot [45] registered
Dec 12 18:34:15.726220 kernel: acpiphp: Slot [46] registered
Dec 12 18:34:15.726226 kernel: acpiphp: Slot [47] registered
Dec 12 18:34:15.726231 kernel: acpiphp: Slot [48] registered
Dec 12 18:34:15.726237 kernel: acpiphp: Slot [49] registered
Dec 12 18:34:15.726243 kernel: acpiphp: Slot [50] registered
Dec 12 18:34:15.726249 kernel: acpiphp: Slot [51] registered
Dec 12 18:34:15.726255 kernel: acpiphp: Slot [52] registered
Dec 12 18:34:15.726261 kernel: acpiphp: Slot [53] registered
Dec 12 18:34:15.726268 kernel: acpiphp: Slot [54] registered
Dec 12 18:34:15.726274 kernel: acpiphp: Slot [55] registered
Dec 12 18:34:15.726279 kernel: acpiphp: Slot [56] registered
Dec 12 18:34:15.726285 kernel: acpiphp: Slot [57] registered
Dec 12 18:34:15.726291 kernel: acpiphp: Slot [58] registered
Dec 12 18:34:15.726297 kernel: acpiphp: Slot [59] registered
Dec 12 18:34:15.726303 kernel: acpiphp: Slot [60] registered
Dec 12 18:34:15.726309 kernel: acpiphp: Slot [61] registered
Dec 12 18:34:15.726314 kernel: acpiphp: Slot [62] registered
Dec 12 18:34:15.726325 kernel: acpiphp: Slot [63] registered
Dec 12 18:34:15.726378 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Dec 12 18:34:15.726428 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Dec 12 18:34:15.726477 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Dec 12 18:34:15.726526 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Dec 12 18:34:15.726594 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Dec 12 18:34:15.726655 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Dec 12 18:34:15.726727 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint
Dec 12 18:34:15.726792 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007]
Dec 12 18:34:15.726857 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit]
Dec 12 18:34:15.726923 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref]
Dec 12 18:34:15.726990 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Dec 12 18:34:15.727055 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Dec 12 18:34:15.727123 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Dec 12 18:34:15.727177 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Dec 12 18:34:15.727231 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Dec 12 18:34:15.727284 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Dec 12 18:34:15.727335 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Dec 12 18:34:15.727386 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Dec 12 18:34:15.727438 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Dec 12 18:34:15.727490 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Dec 12 18:34:15.727545 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint
Dec 12 18:34:15.727615 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff]
Dec 12 18:34:15.727667 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff]
Dec 12 18:34:15.727717 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff]
Dec 12 18:34:15.727767 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f]
Dec 12 18:34:15.727818 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref]
Dec 12 18:34:15.727868 kernel: pci 0000:0b:00.0: supports D1 D2
Dec 12 18:34:15.727919 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Dec 12 18:34:15.727975 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Dec 12 18:34:15.728029 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Dec 12 18:34:15.728080 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Dec 12 18:34:15.728136 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Dec 12 18:34:15.728194 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Dec 12 18:34:15.728245 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Dec 12 18:34:15.728295 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Dec 12 18:34:15.728349 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Dec 12 18:34:15.728402 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Dec 12 18:34:15.728452 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Dec 12 18:34:15.728501 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Dec 12 18:34:15.728559 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Dec 12 18:34:15.728615 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Dec 12 18:34:15.728665 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Dec 12 18:34:15.728714 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Dec 12 18:34:15.728764 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Dec 12 18:34:15.728816 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Dec 12 18:34:15.728866 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Dec 12 18:34:15.728915 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Dec 12 18:34:15.728964 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Dec 12 18:34:15.729014 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Dec 12 18:34:15.729063 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Dec 12 18:34:15.729113 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Dec 12 18:34:15.729165 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Dec 12 18:34:15.729214 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Dec 12 18:34:15.729224 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Dec 12 18:34:15.729230 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Dec 12 18:34:15.729236 kernel: ACPI: PCI: Interrupt link LNKB disabled
Dec 12 18:34:15.729242 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 12 18:34:15.729248 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Dec 12 18:34:15.729254 kernel: iommu: Default domain type: Translated
Dec 12 18:34:15.729262 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 12 18:34:15.729268 kernel: PCI: Using ACPI for IRQ routing
Dec 12 18:34:15.729274 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 12 18:34:15.729280 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Dec 12 18:34:15.729286 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Dec 12 18:34:15.729340 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Dec 12 18:34:15.729389 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Dec 12 18:34:15.729439 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 12 18:34:15.729448 kernel: vgaarb: loaded
Dec 12 18:34:15.729456 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Dec 12 18:34:15.729462 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Dec 12 18:34:15.729468 kernel: clocksource: Switched to clocksource tsc-early
Dec 12 18:34:15.729474 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 18:34:15.729480 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 18:34:15.729486 kernel: pnp: PnP ACPI init
Dec 12 18:34:15.729541 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Dec 12 18:34:15.729595 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Dec 12 18:34:15.729644 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Dec 12 18:34:15.729692 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Dec 12 18:34:15.729740 kernel: pnp 00:06: [dma 2]
Dec 12 18:34:15.729790 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Dec 12 18:34:15.729835 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Dec 12 18:34:15.729879 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Dec 12 18:34:15.729887 kernel: pnp: PnP ACPI: found 8 devices
Dec 12 18:34:15.729896 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 12 18:34:15.729902 kernel: NET: Registered PF_INET protocol family
Dec 12 18:34:15.729908 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 18:34:15.729914 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Dec 12 18:34:15.729920 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 18:34:15.729926 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 12 18:34:15.729932 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Dec 12 18:34:15.729938 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Dec 12 18:34:15.729945 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 12 18:34:15.729951 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 12 18:34:15.729957 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 18:34:15.729963 kernel: NET: Registered PF_XDP protocol family
Dec 12 18:34:15.730011 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Dec 12 18:34:15.730061 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 18:34:15.730112 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 18:34:15.730161 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 18:34:15.730212 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 18:34:15.730264 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 12 18:34:15.730314 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 12 18:34:15.730363 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 12 18:34:15.730423 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 12 18:34:15.730478 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Dec 12 18:34:15.730529 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Dec 12 18:34:15.730592 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Dec 12 18:34:15.730645 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Dec 12 18:34:15.730694 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Dec 12 18:34:15.730743 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Dec 12 18:34:15.730792 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Dec 12 18:34:15.730841 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Dec 12 18:34:15.730890 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Dec 12 18:34:15.730939 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Dec 12 18:34:15.730988 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Dec 12 18:34:15.731039 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Dec 12 18:34:15.731087 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Dec 12 18:34:15.731136 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Dec 12 18:34:15.731185 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned
Dec 12 18:34:15.731234 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned
Dec 12 18:34:15.731283 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731331 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731379 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731430 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731479 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731528 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731583 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731632 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731680 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731729 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731778 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731829 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731877 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.731926 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.731974 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.732023 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.732071 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.732120 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.732172 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.732371 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.733065 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.733128 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.733182 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.733232 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign
Dec 12 18:34:15.733282 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space
Dec 12 18:34:15.733333 kernel: pci 0000:00:17.5:
bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733387 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733437 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733487 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733535 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733600 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733650 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733699 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733748 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733799 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733848 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733898 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.733947 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.733995 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.734044 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.734092 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.734141 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.734192 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.734241 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.734290 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.734338 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.734387 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.734435 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.734484 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.736928 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.736997 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737062 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737137 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737201 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737263 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737314 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737376 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737426 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737481 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737538 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737603 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space Dec 12 18:34:15.737656 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737705 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737754 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737804 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737853 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.737902 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.737951 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738000 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738048 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738100 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738148 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738197 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738246 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738298 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738365 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738416 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738464 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738513 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738604 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738691 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738740 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738789 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Dec 12 18:34:15.738838 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Dec 12 18:34:15.738888 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 12 18:34:15.738937 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Dec 12 18:34:15.738990 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Dec 12 18:34:15.739040 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Dec 12 18:34:15.739093 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Dec 12 18:34:15.739149 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Dec 12 18:34:15.739200 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Dec 12 18:34:15.739252 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Dec 12 18:34:15.739302 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Dec 12 18:34:15.739356 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Dec 12 18:34:15.739413 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Dec 12 18:34:15.739463 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Dec 12 18:34:15.739516 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Dec 12 18:34:15.739603 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] Dec 12 18:34:15.739677 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Dec 12 18:34:15.739741 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Dec 12 18:34:15.739791 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Dec 12 18:34:15.739841 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Dec 12 18:34:15.739890 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Dec 12 18:34:15.739938 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Dec 12 18:34:15.739992 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Dec 12 18:34:15.740042 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Dec 12 18:34:15.740092 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Dec 12 18:34:15.740141 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Dec 12 18:34:15.740191 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Dec 12 18:34:15.740243 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Dec 12 18:34:15.740292 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Dec 12 18:34:15.740346 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Dec 12 18:34:15.740395 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Dec 12 18:34:15.740446 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Dec 12 18:34:15.740497 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Dec 12 18:34:15.740548 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Dec 12 18:34:15.740619 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Dec 12 18:34:15.740682 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Dec 12 18:34:15.740738 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Dec 12 18:34:15.740790 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Dec 12 18:34:15.740839 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Dec 12 18:34:15.740891 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Dec 12 18:34:15.740941 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Dec 12 18:34:15.740993 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Dec 12 18:34:15.741043 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Dec 12 18:34:15.741093 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Dec 12 18:34:15.741144 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Dec 12 18:34:15.741196 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Dec 12 18:34:15.741245 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Dec 12 18:34:15.741294 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Dec 12 18:34:15.741344 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Dec 12 18:34:15.741392 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Dec 12 18:34:15.741440 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Dec 12 18:34:15.741489 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Dec 12 18:34:15.741538 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Dec 12 18:34:15.741639 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Dec 12 18:34:15.741694 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Dec 12 18:34:15.741744 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Dec 12 18:34:15.741793 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Dec 12 18:34:15.741843 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Dec 12 18:34:15.741891 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Dec 12 18:34:15.741940 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Dec 12 18:34:15.741993 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Dec 12 18:34:15.742042 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Dec 12 18:34:15.742090 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Dec 12 18:34:15.742141 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Dec 12 18:34:15.742190 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Dec 12 18:34:15.742239 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Dec 12 18:34:15.742289 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Dec 12 18:34:15.742344 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Dec 12 18:34:15.742393 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Dec 12 18:34:15.742445 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Dec 12 18:34:15.742495 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Dec 12 18:34:15.742545 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Dec 12 18:34:15.742607 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Dec 12 18:34:15.742656 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Dec 12 18:34:15.742704 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Dec 12 18:34:15.743086 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Dec 12 18:34:15.743146 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Dec 12 18:34:15.743199 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Dec 12 18:34:15.743253 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Dec 12 18:34:15.743304 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Dec 12 18:34:15.743355 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Dec 12 18:34:15.743405 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Dec 12 18:34:15.743455 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Dec 12 18:34:15.743505 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Dec 12 18:34:15.744600 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Dec 12 18:34:15.744666 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Dec 12 18:34:15.744724 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Dec 12 18:34:15.744779 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Dec 12 18:34:15.744829 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Dec 12 18:34:15.744879 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Dec 12 18:34:15.744931 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Dec 12 18:34:15.744992 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Dec 12 18:34:15.745042 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Dec 12 18:34:15.745094 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Dec 12 18:34:15.745144 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Dec 12 18:34:15.745194 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Dec 12 18:34:15.745244 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
Dec 12 18:34:15.745292 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Dec 12 18:34:15.745352 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Dec 12 18:34:15.745403 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Dec 12 18:34:15.745453 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Dec 12 18:34:15.745504 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Dec 12 18:34:15.745599 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Dec 12 18:34:15.745654 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Dec 12 18:34:15.745706 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Dec 12 18:34:15.745756 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Dec 12 18:34:15.745805 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Dec 12 18:34:15.745856 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Dec 12 18:34:15.745905 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Dec 12 18:34:15.745982 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Dec 12 18:34:15.746041 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Dec 12 18:34:15.746128 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Dec 12 18:34:15.746201 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Dec 12 18:34:15.746261 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Dec 12 18:34:15.746312 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Dec 12 18:34:15.746368 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Dec 12 18:34:15.746420 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Dec 12 18:34:15.746465 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Dec 12 18:34:15.746509 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Dec 12 18:34:15.747522 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Dec 12 18:34:15.747599 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Dec 12 18:34:15.747653 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Dec 12 18:34:15.747702 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Dec 12 18:34:15.747748 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Dec 12 18:34:15.747797 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Dec 12 18:34:15.747842 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Dec 12 18:34:15.747887 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Dec 12 18:34:15.747932 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Dec 12 18:34:15.747977 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Dec 12 18:34:15.748027 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Dec 12 18:34:15.748073 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Dec 12 18:34:15.748121 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Dec 12 18:34:15.748170 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Dec 12 18:34:15.748216 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Dec 12 18:34:15.748261 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Dec 12 18:34:15.748310 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Dec 12 18:34:15.748358 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Dec 
12 18:34:15.748403 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Dec 12 18:34:15.748455 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Dec 12 18:34:15.748500 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Dec 12 18:34:15.748558 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Dec 12 18:34:15.748944 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Dec 12 18:34:15.748996 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Dec 12 18:34:15.749042 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Dec 12 18:34:15.749095 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Dec 12 18:34:15.749141 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Dec 12 18:34:15.749190 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Dec 12 18:34:15.749235 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Dec 12 18:34:15.749285 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Dec 12 18:34:15.749342 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Dec 12 18:34:15.749389 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Dec 12 18:34:15.749439 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Dec 12 18:34:15.749485 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Dec 12 18:34:15.749530 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Dec 12 18:34:15.749716 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Dec 12 18:34:15.749765 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Dec 12 18:34:15.749813 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Dec 12 18:34:15.749862 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Dec 12 18:34:15.749908 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Dec 12 18:34:15.749959 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Dec 12 18:34:15.750004 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Dec 12 18:34:15.750053 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Dec 12 18:34:15.750102 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Dec 12 18:34:15.750151 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Dec 12 18:34:15.750197 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Dec 12 18:34:15.750247 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Dec 12 18:34:15.750293 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Dec 12 18:34:15.750344 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Dec 12 18:34:15.750389 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Dec 12 18:34:15.750437 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Dec 12 18:34:15.750508 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Dec 12 18:34:15.750594 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Dec 12 18:34:15.750641 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Dec 12 18:34:15.750690 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Dec 12 18:34:15.750756 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Dec 12 18:34:15.750836 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Dec 12 18:34:15.750907 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Dec 12 18:34:15.750971 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Dec 12 18:34:15.751039 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Dec 12 18:34:15.751086 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Dec 12 18:34:15.751139 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Dec 12 18:34:15.751185 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Dec 12 18:34:15.751238 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Dec 12 18:34:15.751283 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Dec 12 18:34:15.751332 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Dec 12 18:34:15.751377 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Dec 12 18:34:15.751427 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 12 18:34:15.751472 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Dec 12 18:34:15.751519 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Dec 12 18:34:15.751584 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Dec 12 18:34:15.751632 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Dec 12 18:34:15.751676 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Dec 12 18:34:15.751725 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Dec 12 18:34:15.751771 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Dec 12 18:34:15.751821 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Dec 12 18:34:15.751870 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Dec 12 18:34:15.751918 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Dec 12 18:34:15.751963 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Dec 12 18:34:15.752012 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Dec 12 18:34:15.752058 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Dec 12 18:34:15.752107 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Dec 12 18:34:15.752155 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Dec 12 18:34:15.752204 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Dec 12 18:34:15.752249 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Dec 12 18:34:15.752305 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Dec 12 18:34:15.752314 kernel: PCI: CLS 32 bytes, default 64 Dec 12 18:34:15.752325 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 12 18:34:15.752332 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Dec 12 18:34:15.752340 kernel: clocksource: Switched to clocksource tsc Dec 12 18:34:15.752346 kernel: Initialise system trusted keyrings Dec 12 18:34:15.752353 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Dec 12 18:34:15.752359 kernel: Key type asymmetric registered Dec 12 18:34:15.752365 kernel: Asymmetric key parser 'x509' registered Dec 12 18:34:15.752371 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 12 18:34:15.752376 kernel: io scheduler mq-deadline registered Dec 12 18:34:15.752383 kernel: io scheduler kyber registered Dec 12 18:34:15.752388 kernel: io scheduler bfq 
registered Dec 12 18:34:15.752442 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Dec 12 18:34:15.752494 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.752546 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Dec 12 18:34:15.752610 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.752661 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Dec 12 18:34:15.752711 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.752762 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Dec 12 18:34:15.752815 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.752867 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Dec 12 18:34:15.752917 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.752968 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Dec 12 18:34:15.753017 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753068 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Dec 12 18:34:15.753117 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753170 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Dec 12 18:34:15.753220 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753271 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Dec 12 18:34:15.753321 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753370 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Dec 12 18:34:15.753419 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753469 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Dec 12 18:34:15.753520 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753578 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Dec 12 18:34:15.753629 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753689 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Dec 12 18:34:15.753742 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.753793 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Dec 12 18:34:15.753844 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
Dec 12 18:34:15.753896 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Dec 12 18:34:15.753948 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754000 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Dec 12 18:34:15.754051 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754101 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Dec 12 18:34:15.754155 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754206 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Dec 12 18:34:15.754257 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754310 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Dec 12 18:34:15.754361 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754411 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Dec 12 18:34:15.754464 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754515 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Dec 12 18:34:15.754572 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754623 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Dec 12 18:34:15.754673 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754727 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Dec 12 18:34:15.754776 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754827 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Dec 12 18:34:15.754877 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.754927 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Dec 12 18:34:15.754978 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755028 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Dec 12 18:34:15.755080 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755131 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Dec 12 18:34:15.755180 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755231 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Dec 12 18:34:15.755281 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 
18:34:15.755335 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Dec 12 18:34:15.755385 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755440 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Dec 12 18:34:15.755490 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755540 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Dec 12 18:34:15.755608 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755660 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Dec 12 18:34:15.755710 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Dec 12 18:34:15.755722 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 12 18:34:15.755729 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 12 18:34:15.755736 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 12 18:34:15.755743 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Dec 12 18:34:15.755749 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 12 18:34:15.755755 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 12 18:34:15.755806 kernel: rtc_cmos 00:01: registered as rtc0 Dec 12 18:34:15.755853 kernel: rtc_cmos 00:01: setting system clock to 2025-12-12T18:34:15 UTC (1765564455) Dec 12 18:34:15.755863 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 12 18:34:15.755908 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Dec 12 18:34:15.755917 kernel: intel_pstate: CPU model not supported Dec 12 18:34:15.755923 kernel: NET: Registered PF_INET6 protocol family Dec 12 18:34:15.755929 kernel: Segment Routing with IPv6 Dec 12 18:34:15.755936 kernel: In-situ OAM (IOAM) with IPv6 Dec 12 18:34:15.755942 kernel: NET: Registered PF_PACKET protocol family Dec 12 18:34:15.755949 kernel: Key type dns_resolver registered Dec 12 18:34:15.755955 kernel: IPI shorthand broadcast: enabled Dec 12 18:34:15.755962 kernel: sched_clock: Marking stable (2665261527, 168339858)->(2848220832, -14619447) Dec 12 18:34:15.755970 kernel: registered taskstats version 1 Dec 12 18:34:15.755977 kernel: Loading compiled-in X.509 certificates Dec 12 18:34:15.755983 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 12 18:34:15.755989 kernel: Demotion targets for Node 0: null Dec 12 18:34:15.755996 kernel: Key type .fscrypt registered Dec 12 18:34:15.756002 kernel: Key type fscrypt-provisioning registered Dec 12 18:34:15.756008 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 12 18:34:15.756015 kernel: ima: Allocated hash algorithm: sha1 Dec 12 18:34:15.756021 kernel: ima: No architecture policies found Dec 12 18:34:15.756028 kernel: clk: Disabling unused clocks Dec 12 18:34:15.756034 kernel: Warning: unable to open an initial console. 
Dec 12 18:34:15.756041 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 12 18:34:15.756048 kernel: Write protecting the kernel read-only data: 40960k Dec 12 18:34:15.756055 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 12 18:34:15.756061 kernel: Run /init as init process Dec 12 18:34:15.756067 kernel: with arguments: Dec 12 18:34:15.756074 kernel: /init Dec 12 18:34:15.756080 kernel: with environment: Dec 12 18:34:15.756087 kernel: HOME=/ Dec 12 18:34:15.756093 kernel: TERM=linux Dec 12 18:34:15.756100 systemd[1]: Successfully made /usr/ read-only. Dec 12 18:34:15.756109 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 18:34:15.756116 systemd[1]: Detected virtualization vmware. Dec 12 18:34:15.756122 systemd[1]: Detected architecture x86-64. Dec 12 18:34:15.756129 systemd[1]: Running in initrd. Dec 12 18:34:15.756135 systemd[1]: No hostname configured, using default hostname. Dec 12 18:34:15.756143 systemd[1]: Hostname set to . Dec 12 18:34:15.756149 systemd[1]: Initializing machine ID from random generator. Dec 12 18:34:15.756156 systemd[1]: Queued start job for default target initrd.target. Dec 12 18:34:15.756162 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:34:15.756168 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:34:15.756176 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 12 18:34:15.756182 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 18:34:15.756190 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 12 18:34:15.756197 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 12 18:34:15.756205 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 12 18:34:15.756211 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 12 18:34:15.756218 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:34:15.756224 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:34:15.756231 systemd[1]: Reached target paths.target - Path Units. Dec 12 18:34:15.756239 systemd[1]: Reached target slices.target - Slice Units. Dec 12 18:34:15.756245 systemd[1]: Reached target swap.target - Swaps. Dec 12 18:34:15.756251 systemd[1]: Reached target timers.target - Timer Units. Dec 12 18:34:15.756258 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 18:34:15.756265 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 18:34:15.756271 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 12 18:34:15.756278 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 12 18:34:15.756284 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Dec 12 18:34:15.756291 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 18:34:15.756298 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:34:15.756305 systemd[1]: Reached target sockets.target - Socket Units. Dec 12 18:34:15.756311 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 12 18:34:15.756322 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 18:34:15.756329 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 12 18:34:15.756336 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 12 18:34:15.756343 systemd[1]: Starting systemd-fsck-usr.service... Dec 12 18:34:15.756349 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 18:34:15.756357 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 18:34:15.756363 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:34:15.756370 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 12 18:34:15.756377 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:34:15.756383 systemd[1]: Finished systemd-fsck-usr.service. Dec 12 18:34:15.756391 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 18:34:15.756413 systemd-journald[224]: Collecting audit messages is disabled. Dec 12 18:34:15.756430 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 18:34:15.756437 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 12 18:34:15.756445 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 18:34:15.756452 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:34:15.756459 kernel: Bridge firewalling registered Dec 12 18:34:15.756465 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 18:34:15.756472 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 12 18:34:15.756479 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 18:34:15.756486 systemd-journald[224]: Journal started Dec 12 18:34:15.756501 systemd-journald[224]: Runtime Journal (/run/log/journal/2bc2899a0f21411bb1b93356e0f7a5b3) is 4.8M, max 38.5M, 33.7M free. Dec 12 18:34:15.705967 systemd-modules-load[226]: Inserted module 'overlay' Dec 12 18:34:15.739703 systemd-modules-load[226]: Inserted module 'br_netfilter' Dec 12 18:34:15.763857 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:34:15.766795 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 18:34:15.769570 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 18:34:15.769916 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 18:34:15.776398 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 18:34:15.777280 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Dec 12 18:34:15.782114 systemd-tmpfiles[249]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 12 18:34:15.785654 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:34:15.787670 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 18:34:15.794265 dracut-cmdline[264]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 12 18:34:15.819723 systemd-resolved[269]: Positive Trust Anchors: Dec 12 18:34:15.819928 systemd-resolved[269]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 18:34:15.819952 systemd-resolved[269]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 18:34:15.822436 systemd-resolved[269]: Defaulting to hostname 'linux'. Dec 12 18:34:15.823290 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 18:34:15.823451 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:34:15.848571 kernel: SCSI subsystem initialized Dec 12 18:34:15.868584 kernel: Loading iSCSI transport class v2.0-870. Dec 12 18:34:15.877578 kernel: iscsi: registered transport (tcp) Dec 12 18:34:15.901601 kernel: iscsi: registered transport (qla4xxx) Dec 12 18:34:15.901671 kernel: QLogic iSCSI HBA Driver Dec 12 18:34:15.912269 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 18:34:15.926661 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:34:15.927764 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 18:34:15.951818 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 12 18:34:15.953065 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 12 18:34:15.996575 kernel: raid6: avx2x4 gen() 38872 MB/s Dec 12 18:34:16.013579 kernel: raid6: avx2x2 gen() 44878 MB/s Dec 12 18:34:16.030994 kernel: raid6: avx2x1 gen() 32645 MB/s Dec 12 18:34:16.031045 kernel: raid6: using algorithm avx2x2 gen() 44878 MB/s Dec 12 18:34:16.048974 kernel: raid6: .... xor() 24848 MB/s, rmw enabled Dec 12 18:34:16.049038 kernel: raid6: using avx2x2 recovery algorithm Dec 12 18:34:16.065587 kernel: xor: automatically using best checksumming function avx Dec 12 18:34:16.186585 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 12 18:34:16.190970 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 12 18:34:16.192354 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Dec 12 18:34:16.213011 systemd-udevd[475]: Using default interface naming scheme 'v255'. Dec 12 18:34:16.217281 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:34:16.218638 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 12 18:34:16.240270 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation Dec 12 18:34:16.256724 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 18:34:16.257484 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 18:34:16.333850 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 18:34:16.335452 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 12 18:34:16.415564 kernel: VMware PVSCSI driver - version 1.0.7.0-k Dec 12 18:34:16.420565 kernel: vmw_pvscsi: using 64bit dma Dec 12 18:34:16.422964 kernel: vmw_pvscsi: max_id: 16 Dec 12 18:34:16.422980 kernel: vmw_pvscsi: setting ring_pages to 8 Dec 12 18:34:16.426796 kernel: vmw_pvscsi: enabling reqCallThreshold Dec 12 18:34:16.426813 kernel: vmw_pvscsi: driver-based request coalescing enabled Dec 12 18:34:16.426827 kernel: vmw_pvscsi: using MSI-X Dec 12 18:34:16.431564 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Dec 12 18:34:16.435587 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Dec 12 18:34:16.436979 kernel: libata version 3.00 loaded. Dec 12 18:34:16.437001 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Dec 12 18:34:16.442442 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Dec 12 18:34:16.442589 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Dec 12 18:34:16.444571 kernel: ata_piix 0000:00:07.1: version 2.13 Dec 12 18:34:16.445576 kernel: scsi host1: ata_piix Dec 12 18:34:16.447389 kernel: scsi host2: ata_piix Dec 12 18:34:16.447491 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Dec 12 18:34:16.447501 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Dec 12 18:34:16.450566 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Dec 12 18:34:16.461405 (udev-worker)[520]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Dec 12 18:34:16.461709 kernel: cryptd: max_cpu_qlen set to 1000 Dec 12 18:34:16.462230 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 18:34:16.462779 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:34:16.464420 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:34:16.465156 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:34:16.488994 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Dec 12 18:34:16.618589 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Dec 12 18:34:16.629018 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Dec 12 18:34:16.631960 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Dec 12 18:34:16.636568 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Dec 12 18:34:16.638977 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Dec 12 18:34:16.639089 kernel: sd 0:0:0:0: [sda] Write Protect is off Dec 12 18:34:16.639155 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Dec 12 18:34:16.639575 kernel: sd 0:0:0:0: [sda] Cache data unavailable Dec 12 18:34:16.641340 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Dec 12 18:34:16.646567 kernel: AES CTR mode by8 optimization enabled Dec 12 18:34:16.725570 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 18:34:16.725611 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Dec 12 18:34:16.747610 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Dec 12 18:34:16.747794 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 12 18:34:16.760571 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Dec 12 18:34:16.886145 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Dec 12 18:34:16.897920 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Dec 12 18:34:16.905980 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Dec 12 18:34:16.913300 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Dec 12 18:34:16.913781 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Dec 12 18:34:16.914821 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 12 18:34:17.037064 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 18:34:17.074577 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 18:34:17.336592 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 12 18:34:17.351408 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 18:34:17.351609 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:34:17.351906 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 18:34:17.352826 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 12 18:34:17.368104 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 12 18:34:18.047572 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 12 18:34:18.048357 disk-uuid[633]: The operation has completed successfully. Dec 12 18:34:18.185415 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 12 18:34:18.185516 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 12 18:34:18.186719 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 12 18:34:18.198405 sh[660]: Success Dec 12 18:34:18.216615 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Dec 12 18:34:18.216671 kernel: device-mapper: uevent: version 1.0.3 Dec 12 18:34:18.218574 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 12 18:34:18.225595 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Dec 12 18:34:18.374379 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 12 18:34:18.375177 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 12 18:34:18.393319 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 12 18:34:18.468587 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (672) Dec 12 18:34:18.471315 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 12 18:34:18.471341 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:34:18.481155 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 12 18:34:18.481210 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 12 18:34:18.481229 kernel: BTRFS info (device dm-0): enabling free space tree Dec 12 18:34:18.483253 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 12 18:34:18.483749 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 12 18:34:18.484447 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Dec 12 18:34:18.486684 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 12 18:34:18.545575 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (695) Dec 12 18:34:18.560462 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:34:18.560511 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:34:18.620574 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 18:34:18.620630 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 18:34:18.624593 kernel: BTRFS info (device sda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:34:18.625852 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 12 18:34:18.628656 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 12 18:34:18.678392 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Dec 12 18:34:18.679086 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 12 18:34:18.740203 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 18:34:18.741212 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 18:34:18.770594 systemd-networkd[846]: lo: Link UP Dec 12 18:34:18.770600 systemd-networkd[846]: lo: Gained carrier Dec 12 18:34:18.771509 systemd-networkd[846]: Enumeration completed Dec 12 18:34:18.771657 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 18:34:18.771837 systemd-networkd[846]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Dec 12 18:34:18.771933 systemd[1]: Reached target network.target - Network. 
Dec 12 18:34:18.775615 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Dec 12 18:34:18.775738 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Dec 12 18:34:18.775467 systemd-networkd[846]: ens192: Link UP Dec 12 18:34:18.775469 systemd-networkd[846]: ens192: Gained carrier Dec 12 18:34:18.878975 ignition[714]: Ignition 2.22.0 Dec 12 18:34:18.879253 ignition[714]: Stage: fetch-offline Dec 12 18:34:18.879480 ignition[714]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:18.879487 ignition[714]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:18.879570 ignition[714]: parsed url from cmdline: "" Dec 12 18:34:18.879572 ignition[714]: no config URL provided Dec 12 18:34:18.879575 ignition[714]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 18:34:18.879580 ignition[714]: no config at "/usr/lib/ignition/user.ign" Dec 12 18:34:18.879976 ignition[714]: config successfully fetched Dec 12 18:34:18.879997 ignition[714]: parsing config with SHA512: 17e0b14cf55c61016e97ed1fa9550c94b382ad47b55efe5adb6d020b102db40e18763befb46a8608ede5a4200129a56845493a01bb32455474f4835797bdd342 Dec 12 18:34:18.884618 unknown[714]: fetched base config from "system" Dec 12 18:34:18.884957 unknown[714]: fetched user config from "vmware" Dec 12 18:34:18.885287 ignition[714]: fetch-offline: fetch-offline passed Dec 12 18:34:18.885322 ignition[714]: Ignition finished successfully Dec 12 18:34:18.886767 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 18:34:18.887027 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Dec 12 18:34:18.887650 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 18:34:18.906569 ignition[855]: Ignition 2.22.0 Dec 12 18:34:18.906574 ignition[855]: Stage: kargs Dec 12 18:34:18.906657 ignition[855]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:18.906663 ignition[855]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:18.907227 ignition[855]: kargs: kargs passed Dec 12 18:34:18.907259 ignition[855]: Ignition finished successfully Dec 12 18:34:18.908766 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 18:34:18.909529 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 18:34:18.926983 ignition[861]: Ignition 2.22.0 Dec 12 18:34:18.927000 ignition[861]: Stage: disks Dec 12 18:34:18.927083 ignition[861]: no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:18.927089 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:18.927699 ignition[861]: disks: disks passed Dec 12 18:34:18.927729 ignition[861]: Ignition finished successfully Dec 12 18:34:18.928673 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 12 18:34:18.928898 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 18:34:18.929026 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 18:34:18.929217 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 18:34:18.929405 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 18:34:18.929591 systemd[1]: Reached target basic.target - Basic System. Dec 12 18:34:18.930249 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Dec 12 18:34:18.951891 systemd-fsck[869]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 12 18:34:18.952998 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 18:34:18.953961 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 18:34:19.070526 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 18:34:19.070723 kernel: EXT4-fs (sda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 12 18:34:19.071154 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 18:34:19.072564 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 18:34:19.074605 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 18:34:19.075182 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 12 18:34:19.075420 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 18:34:19.075445 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 18:34:19.084527 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 18:34:19.086645 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 12 18:34:19.091424 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (878) Dec 12 18:34:19.091457 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:34:19.091466 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:34:19.098035 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 18:34:19.098084 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 18:34:19.100266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 18:34:19.254753 initrd-setup-root[903]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 18:34:19.258874 initrd-setup-root[910]: cut: /sysroot/etc/group: No such file or directory Dec 12 18:34:19.261348 initrd-setup-root[917]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 18:34:19.264644 initrd-setup-root[924]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 18:34:19.380804 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 18:34:19.381739 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 18:34:19.382667 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 12 18:34:19.397619 kernel: BTRFS info (device sda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:34:19.416714 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 12 18:34:19.422509 ignition[992]: INFO : Ignition 2.22.0 Dec 12 18:34:19.422897 ignition[992]: INFO : Stage: mount Dec 12 18:34:19.423148 ignition[992]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:19.423288 ignition[992]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:19.424185 ignition[992]: INFO : mount: mount passed Dec 12 18:34:19.424370 ignition[992]: INFO : Ignition finished successfully Dec 12 18:34:19.425658 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 18:34:19.426500 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 18:34:19.466694 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 12 18:34:19.467891 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 18:34:19.573589 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1004) Dec 12 18:34:19.590244 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 18:34:19.590299 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 18:34:19.644711 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 12 18:34:19.644772 kernel: BTRFS info (device sda6): enabling free space tree Dec 12 18:34:19.645848 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 18:34:19.669810 ignition[1020]: INFO : Ignition 2.22.0 Dec 12 18:34:19.669810 ignition[1020]: INFO : Stage: files Dec 12 18:34:19.670229 ignition[1020]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:19.670229 ignition[1020]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:19.670472 ignition[1020]: DEBUG : files: compiled without relabeling support, skipping Dec 12 18:34:19.686131 ignition[1020]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 18:34:19.686131 ignition[1020]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 18:34:19.716048 ignition[1020]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 18:34:19.716653 ignition[1020]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 18:34:19.716974 unknown[1020]: wrote ssh authorized keys file for user: core Dec 12 18:34:19.717326 ignition[1020]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 18:34:19.738395 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 12 18:34:19.738395 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Dec 12 18:34:19.780722 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 18:34:19.908594 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Dec 12 18:34:19.908594 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 18:34:19.909131 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 12 18:34:19.909131 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 18:34:19.910238 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 18:34:19.910238 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 18:34:19.910238 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 18:34:19.910238 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 18:34:19.910238 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Dec 12 18:34:19.919646 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 18:34:19.919920 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 18:34:19.919920 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:34:19.924817 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:34:19.924817 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:34:19.925429 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Dec 12 18:34:20.374835 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 18:34:20.838699 systemd-networkd[846]: ens192: Gained IPv6LL Dec 12 18:34:21.018363 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Dec 12 18:34:21.018800 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Dec 12 18:34:21.031175 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Dec 12 18:34:21.031175 ignition[1020]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Dec 12 18:34:21.039644 ignition[1020]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 18:34:21.044234 ignition[1020]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 18:34:21.044234 ignition[1020]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Dec 12 18:34:21.044234 ignition[1020]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Dec 12 18:34:21.044842 ignition[1020]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Dec 12 18:34:21.044842 ignition[1020]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Dec 12 18:34:21.044842 ignition[1020]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Dec 12 18:34:21.044842 ignition[1020]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Dec 12 18:34:21.396326 ignition[1020]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Dec 12 18:34:21.398888 ignition[1020]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Dec 12 18:34:21.399121 ignition[1020]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Dec 12 
18:34:21.399121 ignition[1020]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Dec 12 18:34:21.399121 ignition[1020]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 18:34:21.400412 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 18:34:21.400412 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 18:34:21.400412 ignition[1020]: INFO : files: files passed Dec 12 18:34:21.400412 ignition[1020]: INFO : Ignition finished successfully Dec 12 18:34:21.400096 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 18:34:21.401638 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 18:34:21.402346 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 12 18:34:21.409158 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 18:34:21.409235 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 12 18:34:21.413622 initrd-setup-root-after-ignition[1053]: grep: Dec 12 18:34:21.413982 initrd-setup-root-after-ignition[1057]: grep: Dec 12 18:34:21.413982 initrd-setup-root-after-ignition[1053]: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:34:21.413982 initrd-setup-root-after-ignition[1053]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:34:21.414889 initrd-setup-root-after-ignition[1057]: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 18:34:21.415661 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 18:34:21.416054 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 18:34:21.416984 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 18:34:21.447411 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 18:34:21.447539 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 12 18:34:21.448152 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 12 18:34:21.448273 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 18:34:21.448519 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 18:34:21.449187 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 18:34:21.469998 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 18:34:21.470917 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 18:34:21.485673 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:34:21.485980 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:34:21.486173 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 18:34:21.486389 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 18:34:21.486482 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 18:34:21.486895 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 18:34:21.487059 systemd[1]: Stopped target basic.target - Basic System. 
Dec 12 18:34:21.487236 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 18:34:21.487422 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 18:34:21.487664 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 18:34:21.487887 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 18:34:21.488089 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 18:34:21.488295 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 18:34:21.488517 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 18:34:21.488743 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 12 18:34:21.488943 systemd[1]: Stopped target swap.target - Swaps. Dec 12 18:34:21.489112 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 18:34:21.489196 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 18:34:21.489560 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:34:21.489730 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:34:21.489931 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 12 18:34:21.489983 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:34:21.490171 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 12 18:34:21.490246 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 18:34:21.490645 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 18:34:21.490724 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 18:34:21.490987 systemd[1]: Stopped target paths.target - Path Units. Dec 12 18:34:21.491124 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 18:34:21.494581 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:34:21.494762 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 18:34:21.494986 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 18:34:21.495161 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 18:34:21.495225 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 18:34:21.495386 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 18:34:21.495431 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 18:34:21.495630 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 18:34:21.495714 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 18:34:21.495959 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 18:34:21.496030 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 18:34:21.496861 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 18:34:21.498651 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 18:34:21.498764 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 18:34:21.498834 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 18:34:21.499023 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 18:34:21.499102 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Dec 12 18:34:21.502803 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 18:34:21.504760 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 18:34:21.516151 ignition[1077]: INFO : Ignition 2.22.0 Dec 12 18:34:21.516517 ignition[1077]: INFO : Stage: umount Dec 12 18:34:21.516744 ignition[1077]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 18:34:21.516894 ignition[1077]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Dec 12 18:34:21.517767 ignition[1077]: INFO : umount: umount passed Dec 12 18:34:21.517953 ignition[1077]: INFO : Ignition finished successfully Dec 12 18:34:21.518971 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 12 18:34:21.519047 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 18:34:21.519284 systemd[1]: Stopped target network.target - Network. Dec 12 18:34:21.519388 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 18:34:21.519416 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 18:34:21.519595 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 18:34:21.519618 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 18:34:21.519732 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 18:34:21.519752 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 18:34:21.519901 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 18:34:21.519922 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 18:34:21.520124 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 12 18:34:21.520399 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 18:34:21.521723 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 18:34:21.521786 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 18:34:21.523102 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 18:34:21.523364 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 18:34:21.523404 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:34:21.524343 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 18:34:21.528209 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 18:34:21.528280 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 18:34:21.529197 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 18:34:21.529388 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 18:34:21.529692 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 18:34:21.529719 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 18:34:21.530564 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 18:34:21.530657 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 18:34:21.530684 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 18:34:21.530813 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Dec 12 18:34:21.530836 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. 
Dec 12 18:34:21.530953 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 18:34:21.530982 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 18:34:21.531127 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 12 18:34:21.531155 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 18:34:21.531273 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 18:34:21.532048 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 18:34:21.541077 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 18:34:21.541146 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 18:34:21.542769 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 18:34:21.542854 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:34:21.543197 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 18:34:21.543228 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 18:34:21.543355 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 18:34:21.543371 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:34:21.543528 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 18:34:21.543557 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 18:34:21.543842 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 18:34:21.543866 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 18:34:21.544160 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 18:34:21.544182 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 18:34:21.545625 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 12 18:34:21.545877 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 18:34:21.546020 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:34:21.546876 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 18:34:21.547043 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:34:21.547363 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 12 18:34:21.547509 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 18:34:21.547856 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 12 18:34:21.547882 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:34:21.548269 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 18:34:21.548436 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:34:21.549520 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 12 18:34:21.549550 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Dec 12 18:34:21.549578 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. 
Dec 12 18:34:21.549615 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 18:34:21.552880 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 18:34:21.553132 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 18:34:21.913853 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 18:34:21.913945 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 18:34:21.914314 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 18:34:21.914461 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 18:34:21.914494 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 18:34:21.915183 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 18:34:21.931240 systemd[1]: Switching root. Dec 12 18:34:21.980820 systemd-journald[224]: Journal stopped Dec 12 18:34:24.903965 systemd-journald[224]: Received SIGTERM from PID 1 (systemd). Dec 12 18:34:24.904001 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 18:34:24.904009 kernel: SELinux: policy capability open_perms=1 Dec 12 18:34:24.904015 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 18:34:24.904022 kernel: SELinux: policy capability always_check_network=0 Dec 12 18:34:24.904030 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 18:34:24.904039 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 18:34:24.904047 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 18:34:24.904053 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 18:34:24.904058 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 18:34:24.904065 systemd[1]: Successfully loaded SELinux policy in 103.387ms. Dec 12 18:34:24.904074 kernel: audit: type=1403 audit(1765564463.357:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 18:34:24.904080 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 4.378ms. Dec 12 18:34:24.904088 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 18:34:24.904097 systemd[1]: Detected virtualization vmware. Dec 12 18:34:24.904103 systemd[1]: Detected architecture x86-64. Dec 12 18:34:24.904110 systemd[1]: Detected first boot. Dec 12 18:34:24.904119 systemd[1]: Initializing machine ID from random generator. Dec 12 18:34:24.904127 zram_generator::config[1120]: No configuration found. Dec 12 18:34:24.904249 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Dec 12 18:34:24.904266 kernel: Guest personality initialized and is active Dec 12 18:34:24.904277 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 12 18:34:24.904287 kernel: Initialized host personality Dec 12 18:34:24.904297 kernel: NET: Registered PF_VSOCK protocol family Dec 12 18:34:24.904310 systemd[1]: Populated /etc with preset unit settings. Dec 12 18:34:24.904323 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." 
| grep -Po "inet \K[\d.]+") Dec 12 18:34:24.904335 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Dec 12 18:34:24.904346 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 18:34:24.904357 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 18:34:24.904368 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 18:34:24.904379 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 18:34:24.904393 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 18:34:24.904405 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 18:34:24.904417 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 18:34:24.904428 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 18:34:24.904440 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 18:34:24.904452 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 12 18:34:24.904464 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 18:34:24.904475 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 18:34:24.904488 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 18:34:24.904502 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 18:34:24.904515 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 18:34:24.904526 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 18:34:24.904537 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 18:34:24.904549 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 18:34:24.904581 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 12 18:34:24.904594 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 18:34:24.904609 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 18:34:24.904622 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 18:34:24.904634 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 18:34:24.904646 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 18:34:24.904657 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 18:34:24.904669 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 18:34:24.904679 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 18:34:24.904686 systemd[1]: Reached target slices.target - Slice Units. Dec 12 18:34:24.904698 systemd[1]: Reached target swap.target - Swaps. Dec 12 18:34:24.904709 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 18:34:24.904717 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 18:34:24.904723 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. 
Dec 12 18:34:24.904730 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 18:34:24.904740 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 18:34:24.904752 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 18:34:24.904760 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 18:34:24.904770 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 18:34:24.904778 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 18:34:24.904786 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 18:34:24.904793 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:24.904799 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 18:34:24.904812 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 18:34:24.904822 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 18:34:24.904830 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 18:34:24.904841 systemd[1]: Reached target machines.target - Containers. Dec 12 18:34:24.904853 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 18:34:24.904865 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Dec 12 18:34:24.904877 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 18:34:24.904886 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 18:34:24.904895 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 18:34:24.904905 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 18:34:24.904915 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 18:34:24.904925 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 18:34:24.904936 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 18:34:24.904948 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 18:34:24.904956 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 18:34:24.904964 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 18:34:24.904971 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 18:34:24.904981 systemd[1]: Stopped systemd-fsck-usr.service. Dec 12 18:34:24.904989 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:34:24.904997 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 18:34:24.905004 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 18:34:24.905011 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 18:34:24.905018 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... 
Dec 12 18:34:24.905027 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 18:34:24.905038 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 18:34:24.905047 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 18:34:24.905055 systemd[1]: Stopped verity-setup.service. Dec 12 18:34:24.905062 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:24.905069 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 18:34:24.905076 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 18:34:24.905083 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 18:34:24.905090 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 18:34:24.905100 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 18:34:24.905113 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 18:34:24.905121 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 18:34:24.905128 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 18:34:24.905137 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 18:34:24.905146 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 18:34:24.905157 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 18:34:24.905165 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 18:34:24.905172 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 18:34:24.905179 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 12 18:34:24.905187 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 18:34:24.905194 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 18:34:24.905201 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 18:34:24.905232 systemd-journald[1203]: Collecting audit messages is disabled. Dec 12 18:34:24.905260 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 18:34:24.905273 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 18:34:24.905282 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 18:34:24.905293 systemd-journald[1203]: Journal started Dec 12 18:34:24.905326 systemd-journald[1203]: Runtime Journal (/run/log/journal/5f20662f4fd14f96b4a4f839733657d4) is 4.8M, max 38.5M, 33.7M free. Dec 12 18:34:24.679091 systemd[1]: Queued start job for default target multi-user.target. Dec 12 18:34:24.909659 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 18:34:24.909697 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:34:24.691677 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 12 18:34:24.691927 systemd[1]: systemd-journald.service: Deactivated successfully. 
Dec 12 18:34:24.910156 jq[1190]: true Dec 12 18:34:24.915597 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 18:34:24.915644 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 18:34:24.918585 kernel: fuse: init (API version 7.41) Dec 12 18:34:24.921417 jq[1220]: true Dec 12 18:34:24.925563 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 18:34:24.926600 kernel: loop: module loaded Dec 12 18:34:24.937566 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 18:34:24.940592 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 18:34:24.941731 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 12 18:34:24.941867 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 18:34:24.942112 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 18:34:24.942223 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 18:34:24.942450 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 18:34:24.942548 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 18:34:24.942807 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 18:34:24.953674 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 18:34:24.957594 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 12 18:34:24.960585 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 18:34:24.960725 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 18:34:24.966331 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 18:34:24.967648 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 18:34:24.980592 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 18:34:24.980848 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 18:34:24.983628 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 18:34:24.993929 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 18:34:25.009572 kernel: loop0: detected capacity change from 0 to 128560 Dec 12 18:34:25.003804 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Dec 12 18:34:25.003818 systemd-tmpfiles[1223]: ACLs are not supported, ignoring. Dec 12 18:34:25.010276 ignition[1242]: Ignition 2.22.0 Dec 12 18:34:25.072731 systemd-journald[1203]: Time spent on flushing to /var/log/journal/5f20662f4fd14f96b4a4f839733657d4 is 35.993ms for 1769 entries. Dec 12 18:34:25.072731 systemd-journald[1203]: System Journal (/var/log/journal/5f20662f4fd14f96b4a4f839733657d4) is 8M, max 584.8M, 576.8M free. Dec 12 18:34:25.130752 systemd-journald[1203]: Received client request to flush runtime journal. Dec 12 18:34:25.130794 kernel: ACPI: bus type drm_connector registered Dec 12 18:34:25.070821 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Dec 12 18:34:25.012687 ignition[1242]: deleting config from guestinfo properties Dec 12 18:34:25.071177 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 12 18:34:25.104561 ignition[1242]: Successfully deleted config Dec 12 18:34:25.080621 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 18:34:25.081045 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 18:34:25.081663 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 18:34:25.081849 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 18:34:25.106222 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Dec 12 18:34:25.121602 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 18:34:25.134240 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 18:34:25.138576 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 18:34:25.166589 kernel: loop1: detected capacity change from 0 to 2960 Dec 12 18:34:25.362218 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 18:34:25.365678 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 18:34:25.390570 kernel: loop2: detected capacity change from 0 to 110984 Dec 12 18:34:25.394003 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Dec 12 18:34:25.394022 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Dec 12 18:34:25.402412 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 18:34:25.422578 kernel: loop3: detected capacity change from 0 to 224512 Dec 12 18:34:25.591784 kernel: loop4: detected capacity change from 0 to 128560 Dec 12 18:34:25.649577 kernel: loop5: detected capacity change from 0 to 2960 Dec 12 18:34:25.692920 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 18:34:25.846925 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 12 18:34:25.850704 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 18:34:25.851687 kernel: loop6: detected capacity change from 0 to 110984 Dec 12 18:34:25.878515 systemd-udevd[1301]: Using default interface naming scheme 'v255'. Dec 12 18:34:26.022573 kernel: loop7: detected capacity change from 0 to 224512 Dec 12 18:34:26.144830 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 18:34:26.147722 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 18:34:26.202784 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 12 18:34:26.209757 (sd-merge)[1299]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Dec 12 18:34:26.210922 (sd-merge)[1299]: Merged extensions into '/usr'. Dec 12 18:34:26.220978 systemd[1]: Reload requested from client PID 1241 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 18:34:26.220991 systemd[1]: Reloading... Dec 12 18:34:26.298590 zram_generator::config[1363]: No configuration found. 
Dec 12 18:34:26.369940 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Dec 12 18:34:26.372827 kernel: mousedev: PS/2 mouse device common for all mice Dec 12 18:34:26.377569 kernel: ACPI: button: Power Button [PWRF] Dec 12 18:34:26.461694 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Dec 12 18:34:26.477042 systemd-networkd[1303]: lo: Link UP Dec 12 18:34:26.477047 systemd-networkd[1303]: lo: Gained carrier Dec 12 18:34:26.479716 systemd-networkd[1303]: Enumeration completed Dec 12 18:34:26.479956 systemd-networkd[1303]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Dec 12 18:34:26.481628 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Dec 12 18:34:26.481777 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Dec 12 18:34:26.488094 systemd-networkd[1303]: ens192: Link UP Dec 12 18:34:26.488189 systemd-networkd[1303]: ens192: Gained carrier Dec 12 18:34:26.491564 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Dec 12 18:34:26.560048 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Dec 12 18:34:26.560327 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Dec 12 18:34:26.560415 systemd[1]: Reloading finished in 337 ms. Dec 12 18:34:26.573948 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 12 18:34:26.575676 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 18:34:26.582861 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 18:34:26.613629 (udev-worker)[1307]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Dec 12 18:34:26.618681 systemd[1]: Starting ensure-sysext.service... Dec 12 18:34:26.620741 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 12 18:34:26.625058 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 12 18:34:26.628786 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 12 18:34:26.630867 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 18:34:26.656647 systemd[1]: Reload requested from client PID 1441 ('systemctl') (unit ensure-sysext.service)... Dec 12 18:34:26.656659 systemd[1]: Reloading... Dec 12 18:34:26.669890 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 18:34:26.670091 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 18:34:26.670330 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 12 18:34:26.670531 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Dec 12 18:34:26.671074 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Dec 12 18:34:26.671290 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Dec 12 18:34:26.671360 systemd-tmpfiles[1445]: ACLs are not supported, ignoring. Dec 12 18:34:26.698567 zram_generator::config[1478]: No configuration found. 
Dec 12 18:34:26.702774 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 18:34:26.702780 systemd-tmpfiles[1445]: Skipping /boot Dec 12 18:34:26.708908 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot. Dec 12 18:34:26.708970 systemd-tmpfiles[1445]: Skipping /boot Dec 12 18:34:26.793816 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Dec 12 18:34:26.859085 systemd[1]: Reloading finished in 202 ms. Dec 12 18:34:26.882949 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 12 18:34:26.883593 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 12 18:34:26.884131 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 18:34:26.891616 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:26.892793 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 18:34:26.904454 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 12 18:34:26.906684 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 18:34:26.907426 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 18:34:26.908137 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 18:34:26.908327 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:34:26.908399 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:34:26.910719 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 12 18:34:26.913183 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 12 18:34:26.914094 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 12 18:34:26.915117 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 12 18:34:26.915233 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:26.917966 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:26.918083 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:34:26.918143 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:34:26.918211 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Dec 12 18:34:26.920510 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:26.925005 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 18:34:26.925206 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 18:34:26.925285 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 18:34:26.925379 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 18:34:26.927583 systemd[1]: Finished ensure-sysext.service. Dec 12 18:34:26.930249 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 18:34:26.930394 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 18:34:26.937780 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 12 18:34:26.938222 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 18:34:26.938382 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 18:34:26.938840 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 12 18:34:26.939597 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 18:34:26.945312 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 18:34:26.946761 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 18:34:26.947302 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 18:34:26.947350 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 18:34:26.965683 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 12 18:34:27.011074 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 12 18:34:27.021502 systemd-resolved[1543]: Positive Trust Anchors: Dec 12 18:34:27.021512 systemd-resolved[1543]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 12 18:34:27.021536 systemd-resolved[1543]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 12 18:34:27.027523 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Dec 12 18:34:27.027782 systemd[1]: Reached target time-set.target - System Time Set. Dec 12 18:34:27.043288 systemd-resolved[1543]: Defaulting to hostname 'linux'. Dec 12 18:34:27.044355 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 12 18:34:27.044499 systemd[1]: Reached target network.target - Network. 
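The "Positive Trust Anchors" line above is systemd-resolved's built-in DNSSEC root trust anchor (key tag 20326), and the negative anchors exempt private and special-use zones from validation; systemd-timesyncd is brought up just after. Both can be inspected at runtime with stock tooling:

    resolvectl status             # per-link DNS servers and DNSSEC state
    resolvectl query flatcar.org  # exercises the resolver configured above
    timedatectl timesync-status   # NTP server, stratum, poll interval, offset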
Dec 12 18:34:27.044590 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 12 18:34:27.053120 augenrules[1579]: No rules Dec 12 18:34:27.053893 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 18:34:27.054091 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 18:34:27.180845 ldconfig[1229]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 18:34:27.184296 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 18:34:27.186805 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 12 18:34:27.193899 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 18:34:27.207207 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 12 18:34:27.263906 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 12 18:34:27.264303 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Dec 12 18:34:27.264399 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 18:34:27.264650 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 12 18:34:27.264825 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 12 18:34:27.264991 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Dec 12 18:34:27.265220 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 12 18:34:27.265399 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 12 18:34:27.265570 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 12 18:34:27.265731 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 12 18:34:27.265748 systemd[1]: Reached target paths.target - Path Units. Dec 12 18:34:27.265901 systemd[1]: Reached target timers.target - Timer Units. Dec 12 18:36:09.041006 systemd-resolved[1543]: Clock change detected. Flushing caches. Dec 12 18:36:09.041044 systemd-timesyncd[1551]: Contacted time server 45.55.58.103:123 (0.flatcar.pool.ntp.org). Dec 12 18:36:09.041113 systemd-timesyncd[1551]: Initial clock synchronization to Fri 2025-12-12 18:36:09.040963 UTC. Dec 12 18:36:09.047880 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 12 18:36:09.049265 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 12 18:36:09.051108 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 12 18:36:09.051398 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 12 18:36:09.051582 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 12 18:36:09.055311 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 12 18:36:09.063620 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 12 18:36:09.064373 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 12 18:36:09.065246 systemd[1]: Reached target sockets.target - Socket Units. 
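The jump from 18:34:27 to 18:36:09 in the timestamps is not a gap in the log: timesyncd stepped the clock forward on first NTP contact (0.flatcar.pool.ntp.org), which is why systemd-resolved reports "Clock change detected. Flushing caches." The socket units listed afterwards are classic socket activation, with each service starting only on first connection. To see them on a running host:

    systemctl list-sockets        # dbus.socket, docker.socket, sshd.socket, ...
    systemctl status sshd.socket  # per-connection activation, one sshd per client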
Dec 12 18:36:09.065394 systemd[1]: Reached target basic.target - Basic System. Dec 12 18:36:09.065569 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 12 18:36:09.065595 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 12 18:36:09.066579 systemd[1]: Starting containerd.service - containerd container runtime... Dec 12 18:36:09.069966 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 12 18:36:09.084731 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 12 18:36:09.086592 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 12 18:36:09.088952 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 12 18:36:09.089106 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 12 18:36:09.097107 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 12 18:36:09.099279 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 12 18:36:09.102661 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 12 18:36:09.106918 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 12 18:36:09.109015 jq[1598]: false Dec 12 18:36:09.109864 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 12 18:36:09.117623 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 12 18:36:09.118759 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 12 18:36:09.121042 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 12 18:36:09.130390 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing passwd entry cache Dec 12 18:36:09.126716 oslogin_cache_refresh[1600]: Refreshing passwd entry cache Dec 12 18:36:09.126814 systemd[1]: Starting update-engine.service - Update Engine... Dec 12 18:36:09.132373 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting users, quitting Dec 12 18:36:09.132373 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 12 18:36:09.132373 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing group entry cache Dec 12 18:36:09.131957 oslogin_cache_refresh[1600]: Failure getting users, quitting Dec 12 18:36:09.130682 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 12 18:36:09.131972 oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 12 18:36:09.132007 oslogin_cache_refresh[1600]: Refreshing group entry cache Dec 12 18:36:09.133906 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Dec 12 18:36:09.136361 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting groups, quitting Dec 12 18:36:09.136361 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Dec 12 18:36:09.136352 oslogin_cache_refresh[1600]: Failure getting groups, quitting Dec 12 18:36:09.136363 oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 12 18:36:09.140958 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 12 18:36:09.141295 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 12 18:36:09.141727 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 12 18:36:09.141976 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 12 18:36:09.142369 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 12 18:36:09.146377 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 12 18:36:09.146519 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 12 18:36:09.158171 extend-filesystems[1599]: Found /dev/sda6 Dec 12 18:36:09.162883 jq[1610]: true Dec 12 18:36:09.170731 extend-filesystems[1599]: Found /dev/sda9 Dec 12 18:36:09.170731 extend-filesystems[1599]: Checking size of /dev/sda9 Dec 12 18:36:09.176916 (ntainerd)[1626]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 12 18:36:09.183252 update_engine[1609]: I20251212 18:36:09.183177 1609 main.cc:92] Flatcar Update Engine starting Dec 12 18:36:09.187501 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Dec 12 18:36:09.190047 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Dec 12 18:36:09.194231 systemd[1]: motdgen.service: Deactivated successfully. Dec 12 18:36:09.194468 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 12 18:36:09.205109 extend-filesystems[1599]: Old size kept for /dev/sda9 Dec 12 18:36:09.205376 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 12 18:36:09.207333 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 12 18:36:09.217750 jq[1633]: true Dec 12 18:36:09.226898 tar[1615]: linux-amd64/LICENSE Dec 12 18:36:09.227070 tar[1615]: linux-amd64/helm Dec 12 18:36:09.239415 systemd-logind[1608]: Watching system buttons on /dev/input/event2 (Power Button) Dec 12 18:36:09.239432 systemd-logind[1608]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 12 18:36:09.239567 systemd-logind[1608]: New seat seat0. Dec 12 18:36:09.241053 systemd[1]: Started systemd-logind.service - User Login Management. Dec 12 18:36:09.262336 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Dec 12 18:36:09.285500 unknown[1638]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Dec 12 18:36:09.299303 unknown[1638]: Core dump limit set to -1 Dec 12 18:36:09.336192 dbus-daemon[1596]: [system] SELinux support is enabled Dec 12 18:36:09.337613 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 12 18:36:09.339172 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 12 18:36:09.339187 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
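extend-filesystems reporting "Old size kept for /dev/sda9" means the root filesystem already fills its partition, so no grow was needed (Flatcar resizes the ROOT partition and filesystem on first boot). A quick confirmation, with device names taken from the log; the lsblk column set is an assumption about the installed util-linux:

    lsblk -o NAME,SIZE,FSTYPE,LABEL,MOUNTPOINT /dev/sda
    # /dev/sda9 (label ROOT) should report the same size as its partition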
Dec 12 18:36:09.339311 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 12 18:36:09.339320 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 12 18:36:09.342288 dbus-daemon[1596]: [system] Successfully activated service 'org.freedesktop.systemd1' Dec 12 18:36:09.343368 systemd[1]: Started update-engine.service - Update Engine. Dec 12 18:36:09.343871 update_engine[1609]: I20251212 18:36:09.343252 1609 update_check_scheduler.cc:74] Next update check in 3m33s Dec 12 18:36:09.344815 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 12 18:36:09.410494 bash[1664]: Updated "/home/core/.ssh/authorized_keys" Dec 12 18:36:09.413110 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 12 18:36:09.413857 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Dec 12 18:36:09.460110 sshd_keygen[1636]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 12 18:36:09.498920 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 12 18:36:09.501936 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 12 18:36:09.510402 locksmithd[1668]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 12 18:36:09.519134 systemd[1]: issuegen.service: Deactivated successfully. Dec 12 18:36:09.519319 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 12 18:36:09.522927 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 12 18:36:09.545061 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 12 18:36:09.548137 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 12 18:36:09.549712 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 12 18:36:09.550014 systemd[1]: Reached target getty.target - Login Prompts. Dec 12 18:36:09.621280 tar[1615]: linux-amd64/README.md Dec 12 18:36:09.632071 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
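update_engine schedules its first check ("Next update check in 3m33s") and locksmithd starts with the default strategy="reboot". On Flatcar both are commonly steered from /etc/flatcar/update.conf; a hedged sketch with illustrative values, not this host's actual settings:

    cat >>/etc/flatcar/update.conf <<'EOF'
    GROUP=stable
    REBOOT_STRATEGY=etcd-lock
    EOF
    systemctl restart update-engine locksmithd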
Dec 12 18:36:09.646828 containerd[1626]: time="2025-12-12T18:36:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 12 18:36:09.647558 containerd[1626]: time="2025-12-12T18:36:09.647506318Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 12 18:36:09.657404 containerd[1626]: time="2025-12-12T18:36:09.657373634Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.46µs" Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657700530Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657720516Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657831300Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657841345Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657858429Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657923472Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.657936654Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.658093471Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.658102517Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.658109348Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.658114848Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658542 containerd[1626]: time="2025-12-12T18:36:09.658160536Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658282176Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658297825Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658303900Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658331421Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658450864Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 12 18:36:09.658726 containerd[1626]: time="2025-12-12T18:36:09.658487264Z" level=info msg="metadata content store policy set" policy=shared Dec 12 18:36:09.660480 containerd[1626]: time="2025-12-12T18:36:09.660461683Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 12 18:36:09.660835 containerd[1626]: time="2025-12-12T18:36:09.660822117Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 12 18:36:09.660877 containerd[1626]: time="2025-12-12T18:36:09.660869452Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 12 18:36:09.660923 containerd[1626]: time="2025-12-12T18:36:09.660914615Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 12 18:36:09.660969 containerd[1626]: time="2025-12-12T18:36:09.660960201Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 12 18:36:09.661007 containerd[1626]: time="2025-12-12T18:36:09.661000253Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 12 18:36:09.661041 containerd[1626]: time="2025-12-12T18:36:09.661034424Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 12 18:36:09.661081 containerd[1626]: time="2025-12-12T18:36:09.661073331Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 12 18:36:09.661113 containerd[1626]: time="2025-12-12T18:36:09.661107188Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 12 18:36:09.661338 containerd[1626]: time="2025-12-12T18:36:09.661328202Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.661866591Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.661884494Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.661967632Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.661983169Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.661994494Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662003568Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662013092Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662022603Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662031466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662040903Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662049599Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662058021Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662064256Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662095911Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 12 18:36:09.662808 containerd[1626]: time="2025-12-12T18:36:09.662105724Z" level=info msg="Start snapshots syncer" Dec 12 18:36:09.663078 containerd[1626]: time="2025-12-12T18:36:09.662124902Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 12 18:36:09.663078 containerd[1626]: time="2025-12-12T18:36:09.662311008Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662342654Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662372084Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662433178Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662468544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662481051Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662488987Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662497921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662504548Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662513395Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662530354Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: 
time="2025-12-12T18:36:09.662537425Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662543715Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662558711Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 18:36:09.663170 containerd[1626]: time="2025-12-12T18:36:09.662569330Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662577939Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662586771Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662591979Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662597006Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662610363Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662623104Z" level=info msg="runtime interface created" Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662629642Z" level=info msg="created NRI interface" Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662638690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662646973Z" level=info msg="Connect containerd service" Dec 12 18:36:09.663366 containerd[1626]: time="2025-12-12T18:36:09.662658919Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 12 18:36:09.663933 containerd[1626]: time="2025-12-12T18:36:09.663918590Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 12 18:36:09.924584 containerd[1626]: time="2025-12-12T18:36:09.924515833Z" level=info msg="Start subscribing containerd event" Dec 12 18:36:09.926017 containerd[1626]: time="2025-12-12T18:36:09.925986375Z" level=info msg="Start recovering state" Dec 12 18:36:09.926172 containerd[1626]: time="2025-12-12T18:36:09.926152117Z" level=info msg="Start event monitor" Dec 12 18:36:09.926336 containerd[1626]: time="2025-12-12T18:36:09.924733546Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 12 18:36:09.926542 containerd[1626]: time="2025-12-12T18:36:09.926528002Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 12 18:36:09.926591 containerd[1626]: time="2025-12-12T18:36:09.926482594Z" level=info msg="Start cni network conf syncer for default" Dec 12 18:36:09.926646 containerd[1626]: time="2025-12-12T18:36:09.926637121Z" level=info msg="Start streaming server" Dec 12 18:36:09.926686 containerd[1626]: time="2025-12-12T18:36:09.926679052Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 12 18:36:09.926722 containerd[1626]: time="2025-12-12T18:36:09.926715319Z" level=info msg="runtime interface starting up..." Dec 12 18:36:09.926755 containerd[1626]: time="2025-12-12T18:36:09.926749192Z" level=info msg="starting plugins..." Dec 12 18:36:09.926832 containerd[1626]: time="2025-12-12T18:36:09.926812875Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 12 18:36:09.927013 systemd[1]: Started containerd.service - containerd container runtime. Dec 12 18:36:09.927442 containerd[1626]: time="2025-12-12T18:36:09.927430772Z" level=info msg="containerd successfully booted in 0.281006s" Dec 12 18:36:10.091900 systemd-networkd[1303]: ens192: Gained IPv6LL Dec 12 18:36:10.093658 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 12 18:36:10.094266 systemd[1]: Reached target network-online.target - Network is Online. Dec 12 18:36:10.095772 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Dec 12 18:36:10.108112 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:10.109974 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 12 18:36:10.164187 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 12 18:36:10.168930 systemd[1]: coreos-metadata.service: Deactivated successfully. Dec 12 18:36:10.169080 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Dec 12 18:36:10.169435 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 12 18:36:13.616463 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:13.617056 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 12 18:36:13.617739 systemd[1]: Startup finished in 2.699s (kernel) + 7.713s (initrd) + 8.597s (userspace) = 19.009s. Dec 12 18:36:13.628095 (kubelet)[1793]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:36:13.664689 login[1694]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Dec 12 18:36:13.666016 login[1695]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Dec 12 18:36:13.675288 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 12 18:36:13.677672 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 12 18:36:13.680555 systemd-logind[1608]: New session 2 of user core. Dec 12 18:36:13.684106 systemd-logind[1608]: New session 1 of user core. Dec 12 18:36:13.696584 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 12 18:36:13.698351 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 12 18:36:13.727658 (systemd)[1800]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 12 18:36:13.729614 systemd-logind[1608]: New session c1 of user core. 
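containerd's "failed to load cni during init ... no network config found in /etc/cni/net.d" above is expected on a node whose CNI add-on has not been installed yet; the cni conf syncer just started retries once a config appears. A minimal sketch of what such a file looks like (file name, network name, and subnet are illustrative; a real cluster's CNI add-on writes its own):

    mkdir -p /etc/cni/net.d
    cat >/etc/cni/net.d/10-example.conflist <<'EOF'
    {
      "cniVersion": "1.0.0",
      "name": "example-net",
      "plugins": [
        { "type": "bridge", "bridge": "cni0", "isGateway": true, "ipMasq": true,
          "ipam": { "type": "host-local",
                    "ranges": [[{ "subnet": "10.88.0.0/16" }]] } },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF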
Dec 12 18:36:14.004765 systemd[1800]: Queued start job for default target default.target. Dec 12 18:36:14.014891 systemd[1800]: Created slice app.slice - User Application Slice. Dec 12 18:36:14.015011 systemd[1800]: Reached target paths.target - Paths. Dec 12 18:36:14.015103 systemd[1800]: Reached target timers.target - Timers. Dec 12 18:36:14.016083 systemd[1800]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 12 18:36:14.024844 systemd[1800]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 12 18:36:14.024884 systemd[1800]: Reached target sockets.target - Sockets. Dec 12 18:36:14.024913 systemd[1800]: Reached target basic.target - Basic System. Dec 12 18:36:14.024938 systemd[1800]: Reached target default.target - Main User Target. Dec 12 18:36:14.024965 systemd[1800]: Startup finished in 289ms. Dec 12 18:36:14.025009 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 12 18:36:14.037300 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 12 18:36:14.038568 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 12 18:36:16.609846 kubelet[1793]: E1212 18:36:16.609800 1793 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:36:16.611571 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:36:16.611686 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:36:16.612044 systemd[1]: kubelet.service: Consumed 756ms CPU time, 267.3M memory peak. Dec 12 18:36:26.736573 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 12 18:36:26.738215 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:27.089974 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:27.092754 (kubelet)[1843]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:36:27.141428 kubelet[1843]: E1212 18:36:27.141400 1843 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:36:27.144312 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:36:27.144409 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:36:27.144752 systemd[1]: kubelet.service: Consumed 111ms CPU time, 111M memory peak. Dec 12 18:36:37.236521 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 12 18:36:37.238056 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:37.706506 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
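The kubelet crash loop above has a single cause: /var/lib/kubelet/config.yaml does not exist yet. That file is normally written by kubeadm during init/join, so the repeated exit-code failures simply mean the node has not been joined to a cluster at this point. For illustration only, a minimal hand-written KubeletConfiguration that would satisfy the file check (values are assumptions, not this cluster's real settings):

    mkdir -p /var/lib/kubelet
    cat >/var/lib/kubelet/config.yaml <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd     # matches SystemdCgroup=true in the CRI config above
    failSwapOn: false
    EOF
    systemctl restart kubelet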
Dec 12 18:36:37.709440 (kubelet)[1858]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:36:37.743285 kubelet[1858]: E1212 18:36:37.743259 1858 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:36:37.744577 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:36:37.744665 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:36:37.745059 systemd[1]: kubelet.service: Consumed 102ms CPU time, 108.3M memory peak. Dec 12 18:36:39.448585 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 12 18:36:39.449585 systemd[1]: Started sshd@0-139.178.70.101:22-147.75.109.163:44702.service - OpenSSH per-connection server daemon (147.75.109.163:44702). Dec 12 18:36:39.585734 sshd[1866]: Accepted publickey for core from 147.75.109.163 port 44702 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:39.586713 sshd-session[1866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:39.590002 systemd-logind[1608]: New session 3 of user core. Dec 12 18:36:39.602053 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 12 18:36:39.656950 systemd[1]: Started sshd@1-139.178.70.101:22-147.75.109.163:44714.service - OpenSSH per-connection server daemon (147.75.109.163:44714). Dec 12 18:36:39.703390 sshd[1872]: Accepted publickey for core from 147.75.109.163 port 44714 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:39.704758 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:39.708272 systemd-logind[1608]: New session 4 of user core. Dec 12 18:36:39.718962 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 12 18:36:39.767448 sshd[1875]: Connection closed by 147.75.109.163 port 44714 Dec 12 18:36:39.767736 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Dec 12 18:36:39.772866 systemd[1]: sshd@1-139.178.70.101:22-147.75.109.163:44714.service: Deactivated successfully. Dec 12 18:36:39.773857 systemd[1]: session-4.scope: Deactivated successfully. Dec 12 18:36:39.774623 systemd-logind[1608]: Session 4 logged out. Waiting for processes to exit. Dec 12 18:36:39.775663 systemd[1]: Started sshd@2-139.178.70.101:22-147.75.109.163:44728.service - OpenSSH per-connection server daemon (147.75.109.163:44728). Dec 12 18:36:39.778230 systemd-logind[1608]: Removed session 4. Dec 12 18:36:39.812050 sshd[1881]: Accepted publickey for core from 147.75.109.163 port 44728 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:39.812947 sshd-session[1881]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:39.816429 systemd-logind[1608]: New session 5 of user core. Dec 12 18:36:39.825899 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 12 18:36:39.872741 sshd[1884]: Connection closed by 147.75.109.163 port 44728 Dec 12 18:36:39.873104 sshd-session[1881]: pam_unix(sshd:session): session closed for user core Dec 12 18:36:39.884378 systemd[1]: sshd@2-139.178.70.101:22-147.75.109.163:44728.service: Deactivated successfully. 
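Each inbound connection above spawns its own sshd@N-host:port-peer:port.service instance because sshd.socket accepts per connection; the cycle of publickey accept, PAM session open, logind session, and close repeats once per unit. To enumerate them live:

    systemctl list-units 'sshd@*' --all   # one unit per accepted connection
    loginctl list-sessions                # the numbered sessions opened above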
Dec 12 18:36:39.885610 systemd[1]: session-5.scope: Deactivated successfully. Dec 12 18:36:39.886439 systemd-logind[1608]: Session 5 logged out. Waiting for processes to exit. Dec 12 18:36:39.887861 systemd-logind[1608]: Removed session 5. Dec 12 18:36:39.889035 systemd[1]: Started sshd@3-139.178.70.101:22-147.75.109.163:44732.service - OpenSSH per-connection server daemon (147.75.109.163:44732). Dec 12 18:36:39.938244 sshd[1890]: Accepted publickey for core from 147.75.109.163 port 44732 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:39.939182 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:39.942586 systemd-logind[1608]: New session 6 of user core. Dec 12 18:36:39.949946 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 12 18:36:39.998946 sshd[1893]: Connection closed by 147.75.109.163 port 44732 Dec 12 18:36:39.999794 sshd-session[1890]: pam_unix(sshd:session): session closed for user core Dec 12 18:36:40.005011 systemd[1]: sshd@3-139.178.70.101:22-147.75.109.163:44732.service: Deactivated successfully. Dec 12 18:36:40.005978 systemd[1]: session-6.scope: Deactivated successfully. Dec 12 18:36:40.006919 systemd-logind[1608]: Session 6 logged out. Waiting for processes to exit. Dec 12 18:36:40.008090 systemd[1]: Started sshd@4-139.178.70.101:22-147.75.109.163:44748.service - OpenSSH per-connection server daemon (147.75.109.163:44748). Dec 12 18:36:40.009095 systemd-logind[1608]: Removed session 6. Dec 12 18:36:40.052822 sshd[1899]: Accepted publickey for core from 147.75.109.163 port 44748 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:40.053557 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:40.056151 systemd-logind[1608]: New session 7 of user core. Dec 12 18:36:40.065926 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 12 18:36:40.190415 sudo[1903]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 12 18:36:40.190584 sudo[1903]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:36:40.203141 sudo[1903]: pam_unix(sudo:session): session closed for user root Dec 12 18:36:40.203976 sshd[1902]: Connection closed by 147.75.109.163 port 44748 Dec 12 18:36:40.204341 sshd-session[1899]: pam_unix(sshd:session): session closed for user core Dec 12 18:36:40.214299 systemd[1]: sshd@4-139.178.70.101:22-147.75.109.163:44748.service: Deactivated successfully. Dec 12 18:36:40.215643 systemd[1]: session-7.scope: Deactivated successfully. Dec 12 18:36:40.216169 systemd-logind[1608]: Session 7 logged out. Waiting for processes to exit. Dec 12 18:36:40.217957 systemd[1]: Started sshd@5-139.178.70.101:22-147.75.109.163:44750.service - OpenSSH per-connection server daemon (147.75.109.163:44750). Dec 12 18:36:40.218356 systemd-logind[1608]: Removed session 7. Dec 12 18:36:40.259233 sshd[1909]: Accepted publickey for core from 147.75.109.163 port 44750 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:40.259868 sshd-session[1909]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:40.263829 systemd-logind[1608]: New session 8 of user core. Dec 12 18:36:40.272971 systemd[1]: Started session-8.scope - Session 8 of User core. 
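The first sudo command of the session is /usr/sbin/setenforce 1, switching SELinux into enforcing mode at runtime (no reboot or policy reload needed). Verifying the switch:

    getenforce   # prints Enforcing after the setenforce 1 above
    sestatus     # mode, loaded policy name, and selinuxfs mount point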
Dec 12 18:36:40.322409 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 18:36:40.322596 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:36:40.325342 sudo[1914]: pam_unix(sudo:session): session closed for user root Dec 12 18:36:40.328659 sudo[1913]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 18:36:40.328974 sudo[1913]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:36:40.335465 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 18:36:40.363088 augenrules[1936]: No rules Dec 12 18:36:40.363954 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 18:36:40.364119 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 18:36:40.364911 sudo[1913]: pam_unix(sudo:session): session closed for user root Dec 12 18:36:40.365729 sshd[1912]: Connection closed by 147.75.109.163 port 44750 Dec 12 18:36:40.366004 sshd-session[1909]: pam_unix(sshd:session): session closed for user core Dec 12 18:36:40.376136 systemd[1]: sshd@5-139.178.70.101:22-147.75.109.163:44750.service: Deactivated successfully. Dec 12 18:36:40.377138 systemd[1]: session-8.scope: Deactivated successfully. Dec 12 18:36:40.377606 systemd-logind[1608]: Session 8 logged out. Waiting for processes to exit. Dec 12 18:36:40.379100 systemd[1]: Started sshd@6-139.178.70.101:22-147.75.109.163:44762.service - OpenSSH per-connection server daemon (147.75.109.163:44762). Dec 12 18:36:40.379503 systemd-logind[1608]: Removed session 8. Dec 12 18:36:40.420407 sshd[1945]: Accepted publickey for core from 147.75.109.163 port 44762 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:36:40.421182 sshd-session[1945]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:36:40.424444 systemd-logind[1608]: New session 9 of user core. Dec 12 18:36:40.429881 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 12 18:36:40.477895 sudo[1949]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 18:36:40.478324 sudo[1949]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 18:36:40.832698 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 18:36:40.841080 (dockerd)[1967]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 18:36:41.089247 dockerd[1967]: time="2025-12-12T18:36:41.089180623Z" level=info msg="Starting up" Dec 12 18:36:41.090017 dockerd[1967]: time="2025-12-12T18:36:41.090006990Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 18:36:41.097999 dockerd[1967]: time="2025-12-12T18:36:41.097978823Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 18:36:41.146697 dockerd[1967]: time="2025-12-12T18:36:41.146676054Z" level=info msg="Loading containers: start." Dec 12 18:36:41.155796 kernel: Initializing XFRM netlink socket Dec 12 18:36:41.353070 systemd-networkd[1303]: docker0: Link UP Dec 12 18:36:41.354521 dockerd[1967]: time="2025-12-12T18:36:41.354503779Z" level=info msg="Loading containers: done." 
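The session then removes the stock audit rule files and restarts audit-rules.service, after which augenrules reports "No rules", i.e. the kernel audit ruleset is now empty. A sketch of how rules would be reinstated later (file name and the watch rule are illustrative):

    cat >/etc/audit/rules.d/10-example.rules <<'EOF'
    -w /etc/ssh/sshd_config -p wa -k sshd_config
    EOF
    augenrules --load   # concatenates rules.d/*.rules and loads the result
    auditctl -l         # lists the active kernel ruleset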
Dec 12 18:36:41.362206 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3160409916-merged.mount: Deactivated successfully. Dec 12 18:36:41.362565 dockerd[1967]: time="2025-12-12T18:36:41.362546282Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 18:36:41.362648 dockerd[1967]: time="2025-12-12T18:36:41.362638791Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 18:36:41.362727 dockerd[1967]: time="2025-12-12T18:36:41.362718954Z" level=info msg="Initializing buildkit" Dec 12 18:36:41.373150 dockerd[1967]: time="2025-12-12T18:36:41.373109279Z" level=info msg="Completed buildkit initialization" Dec 12 18:36:41.377974 dockerd[1967]: time="2025-12-12T18:36:41.377961975Z" level=info msg="Daemon has completed initialization" Dec 12 18:36:41.378093 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 18:36:41.378309 dockerd[1967]: time="2025-12-12T18:36:41.378286653Z" level=info msg="API listen on /run/docker.sock" Dec 12 18:36:42.203363 containerd[1626]: time="2025-12-12T18:36:42.203334194Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 12 18:36:43.046281 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4003944913.mount: Deactivated successfully. Dec 12 18:36:43.986913 containerd[1626]: time="2025-12-12T18:36:43.986844406Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:43.991627 containerd[1626]: time="2025-12-12T18:36:43.991612310Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=29072183" Dec 12 18:36:43.998777 containerd[1626]: time="2025-12-12T18:36:43.998742709Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:44.004103 containerd[1626]: time="2025-12-12T18:36:44.004059135Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:44.004835 containerd[1626]: time="2025-12-12T18:36:44.004700221Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 1.801339026s" Dec 12 18:36:44.004835 containerd[1626]: time="2025-12-12T18:36:44.004729958Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\"" Dec 12 18:36:44.005250 containerd[1626]: time="2025-12-12T18:36:44.005212982Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 12 18:36:45.739506 containerd[1626]: time="2025-12-12T18:36:45.739468331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 
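dockerd finishes with "API listen on /run/docker.sock"; the overlay2 warning concerns image-build performance only (CONFIG_OVERLAY_FS_REDIRECT_DIR rules out the native diff path). Two quick liveness checks against the socket, assuming curl is present:

    docker version --format '{{.Server.Version}}'          # 28.0.4 per the log
    curl --unix-socket /run/docker.sock http://localhost/_ping; echo   # prints OK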
18:36:45.749267 containerd[1626]: time="2025-12-12T18:36:45.749241799Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24992010" Dec 12 18:36:45.762004 containerd[1626]: time="2025-12-12T18:36:45.761969665Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:45.771948 containerd[1626]: time="2025-12-12T18:36:45.771918484Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:45.772616 containerd[1626]: time="2025-12-12T18:36:45.772522876Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 1.767195242s" Dec 12 18:36:45.772616 containerd[1626]: time="2025-12-12T18:36:45.772545334Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\"" Dec 12 18:36:45.772833 containerd[1626]: time="2025-12-12T18:36:45.772820033Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 12 18:36:46.963428 containerd[1626]: time="2025-12-12T18:36:46.963390228Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:46.968043 containerd[1626]: time="2025-12-12T18:36:46.968010164Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19404248" Dec 12 18:36:46.975586 containerd[1626]: time="2025-12-12T18:36:46.975548097Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:46.984879 containerd[1626]: time="2025-12-12T18:36:46.984850931Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:46.985524 containerd[1626]: time="2025-12-12T18:36:46.985453434Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 1.212569287s" Dec 12 18:36:46.985524 containerd[1626]: time="2025-12-12T18:36:46.985470922Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\"" Dec 12 18:36:46.985928 containerd[1626]: time="2025-12-12T18:36:46.985851535Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 12 18:36:47.928814 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
Dec 12 18:36:47.930481 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:48.166446 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1666326332.mount: Deactivated successfully. Dec 12 18:36:48.196033 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:48.198948 (kubelet)[2253]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 18:36:48.238995 kubelet[2253]: E1212 18:36:48.238796 2253 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 18:36:48.240107 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 18:36:48.240191 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 18:36:48.240504 systemd[1]: kubelet.service: Consumed 94ms CPU time, 108.2M memory peak. Dec 12 18:36:48.738190 containerd[1626]: time="2025-12-12T18:36:48.738150195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:48.752726 containerd[1626]: time="2025-12-12T18:36:48.752696116Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31161423" Dec 12 18:36:48.758395 containerd[1626]: time="2025-12-12T18:36:48.757751063Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:48.768165 containerd[1626]: time="2025-12-12T18:36:48.768143771Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:48.768584 containerd[1626]: time="2025-12-12T18:36:48.768563547Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 1.7826949s" Dec 12 18:36:48.768645 containerd[1626]: time="2025-12-12T18:36:48.768634477Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\"" Dec 12 18:36:48.769024 containerd[1626]: time="2025-12-12T18:36:48.769004122Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 12 18:36:49.610739 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1524344505.mount: Deactivated successfully. 
Dec 12 18:36:50.443801 containerd[1626]: time="2025-12-12T18:36:50.443553079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:50.451039 containerd[1626]: time="2025-12-12T18:36:50.451017251Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Dec 12 18:36:50.460674 containerd[1626]: time="2025-12-12T18:36:50.460643344Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:50.470179 containerd[1626]: time="2025-12-12T18:36:50.470144094Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:50.471718 containerd[1626]: time="2025-12-12T18:36:50.471569640Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.702542259s" Dec 12 18:36:50.471718 containerd[1626]: time="2025-12-12T18:36:50.471593212Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Dec 12 18:36:50.472155 containerd[1626]: time="2025-12-12T18:36:50.472059701Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 18:36:50.949509 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1606942727.mount: Deactivated successfully. 
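Each completed pull emits three ImageCreate events: the repo tag, the image ID (the sha256 config digest), and the repo digest. A small classifier over the three coredns names copied from the log; the grouping rule is an illustration, not a containerd API:

    names = [
        "registry.k8s.io/coredns/coredns:v1.11.3",
        "sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6",
        "registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e",
    ]

    def classify(name):
        if name.startswith("sha256:"):
            return "image id"
        if "@sha256:" in name:
            return "repo digest"
        return "repo tag"

    for n in names:
        print(f"{classify(n):>11}: {n}")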
Dec 12 18:36:50.951628 containerd[1626]: time="2025-12-12T18:36:50.951213182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:36:50.951628 containerd[1626]: time="2025-12-12T18:36:50.951585678Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Dec 12 18:36:50.951628 containerd[1626]: time="2025-12-12T18:36:50.951608438Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:36:50.952663 containerd[1626]: time="2025-12-12T18:36:50.952651642Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 18:36:50.953085 containerd[1626]: time="2025-12-12T18:36:50.953070809Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 480.97516ms" Dec 12 18:36:50.953117 containerd[1626]: time="2025-12-12T18:36:50.953086871Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 12 18:36:50.953414 containerd[1626]: time="2025-12-12T18:36:50.953392650Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 12 18:36:51.557432 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2814980769.mount: Deactivated successfully. 
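Unlike the other images, pause:3.10 carries the extra label io.cri-containerd.pinned=pinned, which exempts it from image garbage collection. A sketch of that filter using label data copied from the events above:

    images = [
        ("registry.k8s.io/pause:3.10",
         {"io.cri-containerd.image": "managed", "io.cri-containerd.pinned": "pinned"}),
        ("registry.k8s.io/coredns/coredns:v1.11.3",
         {"io.cri-containerd.image": "managed"}),
    ]

    pinned = [name for name, labels in images
              if labels.get("io.cri-containerd.pinned") == "pinned"]
    print(pinned)  # ['registry.k8s.io/pause:3.10']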
Dec 12 18:36:53.357394 containerd[1626]: time="2025-12-12T18:36:53.357337629Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:53.358356 containerd[1626]: time="2025-12-12T18:36:53.358341717Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Dec 12 18:36:53.358660 containerd[1626]: time="2025-12-12T18:36:53.358643123Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:53.360569 containerd[1626]: time="2025-12-12T18:36:53.360553094Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:36:53.361159 containerd[1626]: time="2025-12-12T18:36:53.361143205Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.4077319s" Dec 12 18:36:53.361189 containerd[1626]: time="2025-12-12T18:36:53.361166170Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Dec 12 18:36:54.620867 update_engine[1609]: I20251212 18:36:54.620817 1609 update_attempter.cc:509] Updating boot flags... Dec 12 18:36:55.498483 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:55.498855 systemd[1]: kubelet.service: Consumed 94ms CPU time, 108.2M memory peak. Dec 12 18:36:55.506297 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:55.517676 systemd[1]: Reload requested from client PID 2419 ('systemctl') (unit session-9.scope)... Dec 12 18:36:55.517693 systemd[1]: Reloading... Dec 12 18:36:55.567807 zram_generator::config[2464]: No configuration found. Dec 12 18:36:55.648230 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Dec 12 18:36:55.716302 systemd[1]: Reloading finished in 198 ms. Dec 12 18:36:55.750341 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 12 18:36:55.750473 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 12 18:36:55.750681 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:55.753921 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:36:56.057915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:36:56.066135 (kubelet)[2531]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 18:36:56.091996 kubelet[2531]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:36:56.091996 kubelet[2531]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Dec 12 18:36:56.091996 kubelet[2531]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:36:56.091996 kubelet[2531]: I1212 18:36:56.090226 2531 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 18:36:56.255027 kubelet[2531]: I1212 18:36:56.255003 2531 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 18:36:56.255027 kubelet[2531]: I1212 18:36:56.255023 2531 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 18:36:56.255188 kubelet[2531]: I1212 18:36:56.255177 2531 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 18:36:56.309477 kubelet[2531]: I1212 18:36:56.309419 2531 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 18:36:56.313798 kubelet[2531]: E1212 18:36:56.313188 2531 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.101:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:56.320363 kubelet[2531]: I1212 18:36:56.320298 2531 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 18:36:56.323250 kubelet[2531]: I1212 18:36:56.323238 2531 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 18:36:56.327066 kubelet[2531]: I1212 18:36:56.327041 2531 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 18:36:56.327181 kubelet[2531]: I1212 18:36:56.327064 2531 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 18:36:56.328442 kubelet[2531]: I1212 18:36:56.328428 2531 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 18:36:56.328442 kubelet[2531]: I1212 18:36:56.328441 2531 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 18:36:56.329731 kubelet[2531]: I1212 18:36:56.329717 2531 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:36:56.332796 kubelet[2531]: I1212 18:36:56.332779 2531 kubelet.go:446] "Attempting to sync node with API server" Dec 12 18:36:56.332826 kubelet[2531]: I1212 18:36:56.332802 2531 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 18:36:56.332826 kubelet[2531]: I1212 18:36:56.332815 2531 kubelet.go:352] "Adding apiserver pod source" Dec 12 18:36:56.332826 kubelet[2531]: I1212 18:36:56.332821 2531 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 18:36:56.337396 kubelet[2531]: W1212 18:36:56.337139 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:56.337396 kubelet[2531]: E1212 18:36:56.337171 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:56.337396 kubelet[2531]: W1212 18:36:56.337357 2531 
reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:56.337396 kubelet[2531]: E1212 18:36:56.337375 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:56.338860 kubelet[2531]: I1212 18:36:56.338849 2531 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 18:36:56.341249 kubelet[2531]: I1212 18:36:56.341241 2531 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 18:36:56.341322 kubelet[2531]: W1212 18:36:56.341316 2531 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 12 18:36:56.342652 kubelet[2531]: I1212 18:36:56.342643 2531 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 18:36:56.342706 kubelet[2531]: I1212 18:36:56.342701 2531 server.go:1287] "Started kubelet" Dec 12 18:36:56.344406 kubelet[2531]: I1212 18:36:56.344084 2531 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 12 18:36:56.345512 kubelet[2531]: I1212 18:36:56.345112 2531 server.go:479] "Adding debug handlers to kubelet server" Dec 12 18:36:56.345914 kubelet[2531]: I1212 18:36:56.345884 2531 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 18:36:56.346081 kubelet[2531]: I1212 18:36:56.346073 2531 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 18:36:56.352306 kubelet[2531]: E1212 18:36:56.347838 2531 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.101:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.101:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18808baa091ad488 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-12-12 18:36:56.342688904 +0000 UTC m=+0.274428449,LastTimestamp:2025-12-12 18:36:56.342688904 +0000 UTC m=+0.274428449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Dec 12 18:36:56.352590 kubelet[2531]: I1212 18:36:56.352584 2531 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 18:36:56.355170 kubelet[2531]: I1212 18:36:56.352660 2531 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 18:36:56.355295 kubelet[2531]: I1212 18:36:56.355289 2531 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 18:36:56.355524 kubelet[2531]: I1212 18:36:56.355518 2531 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 
18:36:56.355675 kubelet[2531]: I1212 18:36:56.355621 2531 reconciler.go:26] "Reconciler: start to sync state" Dec 12 18:36:56.355950 kubelet[2531]: W1212 18:36:56.355919 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:56.356008 kubelet[2531]: E1212 18:36:56.355999 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:56.356315 kubelet[2531]: E1212 18:36:56.356270 2531 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Dec 12 18:36:56.356395 kubelet[2531]: E1212 18:36:56.356384 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="200ms" Dec 12 18:36:56.385682 kubelet[2531]: I1212 18:36:56.385656 2531 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 18:36:56.386637 kubelet[2531]: I1212 18:36:56.386467 2531 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 18:36:56.386637 kubelet[2531]: I1212 18:36:56.386485 2531 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 18:36:56.386637 kubelet[2531]: I1212 18:36:56.386497 2531 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
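The reflector and lease errors above share one cause: nothing is listening on the API server endpoint 139.178.70.101:6443 yet, because the kube-apiserver static pod is still being created. A bare TCP probe reproduces the same condition (illustrative; the endpoint is taken from the log):

    import socket

    def api_server_up(host="139.178.70.101", port=6443, timeout=1.0):
        # "connection refused" in the log maps to OSError here.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(api_server_up())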
Dec 12 18:36:56.386637 kubelet[2531]: I1212 18:36:56.386501 2531 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 18:36:56.386637 kubelet[2531]: E1212 18:36:56.386526 2531 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 18:36:56.389938 kubelet[2531]: W1212 18:36:56.389917 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:56.390005 kubelet[2531]: E1212 18:36:56.389996 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:56.390139 kubelet[2531]: I1212 18:36:56.390130 2531 factory.go:221] Registration of the systemd container factory successfully Dec 12 18:36:56.390214 kubelet[2531]: I1212 18:36:56.390205 2531 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 18:36:56.390664 kubelet[2531]: I1212 18:36:56.390656 2531 factory.go:221] Registration of the containerd container factory successfully Dec 12 18:36:56.409294 kubelet[2531]: E1212 18:36:56.409272 2531 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 18:36:56.413278 kubelet[2531]: I1212 18:36:56.413264 2531 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 18:36:56.413278 kubelet[2531]: I1212 18:36:56.413277 2531 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 18:36:56.413332 kubelet[2531]: I1212 18:36:56.413290 2531 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:36:56.425626 kubelet[2531]: I1212 18:36:56.425608 2531 policy_none.go:49] "None policy: Start" Dec 12 18:36:56.425626 kubelet[2531]: I1212 18:36:56.425627 2531 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 18:36:56.425690 kubelet[2531]: I1212 18:36:56.425636 2531 state_mem.go:35] "Initializing new in-memory state store" Dec 12 18:36:56.451740 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 12 18:36:56.456567 kubelet[2531]: E1212 18:36:56.456550 2531 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Dec 12 18:36:56.460674 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 12 18:36:56.464482 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
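The three slices just created mirror the kubelet's cgroup layout under the systemd driver: Burstable and BestEffort pods get dedicated sub-slices, while Guaranteed pods sit directly under kubepods.slice. A descriptive mapping, not kubelet code; the Guaranteed placement is standard kubelet behaviour rather than something shown in this log:

    def qos_parent_slice(qos_class):
        return {
            "Guaranteed": "kubepods.slice",
            "Burstable": "kubepods-burstable.slice",
            "BestEffort": "kubepods-besteffort.slice",
        }[qos_class]

    for qos in ("Guaranteed", "Burstable", "BestEffort"):
        print(f"{qos:>10} -> {qos_parent_slice(qos)}")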
Dec 12 18:36:56.472330 kubelet[2531]: I1212 18:36:56.472316 2531 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 18:36:56.472432 kubelet[2531]: I1212 18:36:56.472421 2531 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 18:36:56.472459 kubelet[2531]: I1212 18:36:56.472430 2531 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 18:36:56.472909 kubelet[2531]: I1212 18:36:56.472646 2531 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 18:36:56.473487 kubelet[2531]: E1212 18:36:56.473475 2531 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 12 18:36:56.473521 kubelet[2531]: E1212 18:36:56.473512 2531 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Dec 12 18:36:56.493293 systemd[1]: Created slice kubepods-burstable-podc5cb32fef9f4c71fe4bba9b9f7b28ac3.slice - libcontainer container kubepods-burstable-podc5cb32fef9f4c71fe4bba9b9f7b28ac3.slice. Dec 12 18:36:56.502408 kubelet[2531]: E1212 18:36:56.502383 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:56.503947 systemd[1]: Created slice kubepods-burstable-pod55d9ac750f8c9141f337af8b08cf5c9d.slice - libcontainer container kubepods-burstable-pod55d9ac750f8c9141f337af8b08cf5c9d.slice. Dec 12 18:36:56.505891 kubelet[2531]: E1212 18:36:56.505602 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:56.508047 systemd[1]: Created slice kubepods-burstable-pod0a68423804124305a9de061f38780871.slice - libcontainer container kubepods-burstable-pod0a68423804124305a9de061f38780871.slice. 
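Each of the three per-pod slices embeds the pod UID; for static pods that UID is a hash of the manifest file, which is why it stays stable across kubelet restarts (the same three UIDs reappear in the volume lines below). A sketch reproducing the naming seen in the log; real kubelet also escapes '-' in UIDs, which these hash UIDs do not contain:

    def pod_slice_name(qos, uid):
        # Guaranteed pods omit the QoS segment; the pods above are burstable.
        qos_part = "" if qos == "guaranteed" else f"-{qos}"
        return f"kubepods{qos_part}-pod{uid}.slice"

    print(pod_slice_name("burstable", "0a68423804124305a9de061f38780871"))
    # kubepods-burstable-pod0a68423804124305a9de061f38780871.slice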
Dec 12 18:36:56.509327 kubelet[2531]: E1212 18:36:56.509234 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:56.556667 kubelet[2531]: I1212 18:36:56.556641 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:56.686504 kubelet[2531]: I1212 18:36:56.556792 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " pod="kube-system/kube-apiserver-localhost" Dec 12 18:36:56.686504 kubelet[2531]: I1212 18:36:56.556817 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " pod="kube-system/kube-apiserver-localhost" Dec 12 18:36:56.686504 kubelet[2531]: I1212 18:36:56.556831 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:56.686504 kubelet[2531]: I1212 18:36:56.556845 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:56.686504 kubelet[2531]: I1212 18:36:56.556857 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " pod="kube-system/kube-apiserver-localhost" Dec 12 18:36:56.686661 kubelet[2531]: I1212 18:36:56.556869 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:56.686661 kubelet[2531]: I1212 18:36:56.556880 2531 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:56.686661 kubelet[2531]: I1212 18:36:56.556896 2531 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a68423804124305a9de061f38780871-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0a68423804124305a9de061f38780871\") " pod="kube-system/kube-scheduler-localhost" Dec 12 18:36:56.686661 kubelet[2531]: E1212 18:36:56.556986 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="400ms" Dec 12 18:36:56.686661 kubelet[2531]: I1212 18:36:56.574011 2531 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:36:56.686661 kubelet[2531]: E1212 18:36:56.574249 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost" Dec 12 18:36:56.775587 kubelet[2531]: I1212 18:36:56.775360 2531 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:36:56.775730 kubelet[2531]: E1212 18:36:56.775592 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost" Dec 12 18:36:56.804795 containerd[1626]: time="2025-12-12T18:36:56.804666919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c5cb32fef9f4c71fe4bba9b9f7b28ac3,Namespace:kube-system,Attempt:0,}" Dec 12 18:36:56.806820 containerd[1626]: time="2025-12-12T18:36:56.806739147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:55d9ac750f8c9141f337af8b08cf5c9d,Namespace:kube-system,Attempt:0,}" Dec 12 18:36:56.810258 containerd[1626]: time="2025-12-12T18:36:56.810238514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0a68423804124305a9de061f38780871,Namespace:kube-system,Attempt:0,}" Dec 12 18:36:56.944931 containerd[1626]: time="2025-12-12T18:36:56.944703111Z" level=info msg="connecting to shim 6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642" address="unix:///run/containerd/s/e1d2ed65ce62d30727cf000d750708ad1627c90cce5dd1d3201aa644d26ed0c6" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:36:56.945373 containerd[1626]: time="2025-12-12T18:36:56.945274683Z" level=info msg="connecting to shim 39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3" address="unix:///run/containerd/s/acdcd08b74d148a9f84fd005df13bae9d5def801e3061ab4a48a477e54b360e6" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:36:56.951037 containerd[1626]: time="2025-12-12T18:36:56.951013538Z" level=info msg="connecting to shim 0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50" address="unix:///run/containerd/s/e0620acf54df334441d7b870be19d4f2f9435ffda5963bae74692e6510f1854f" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:36:56.958021 kubelet[2531]: E1212 18:36:56.957995 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="800ms" Dec 12 18:36:57.094964 systemd[1]: Started 
cri-containerd-0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50.scope - libcontainer container 0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50. Dec 12 18:36:57.098240 systemd[1]: Started cri-containerd-39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3.scope - libcontainer container 39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3. Dec 12 18:36:57.099155 systemd[1]: Started cri-containerd-6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642.scope - libcontainer container 6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642. Dec 12 18:36:57.165166 containerd[1626]: time="2025-12-12T18:36:57.165145336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:c5cb32fef9f4c71fe4bba9b9f7b28ac3,Namespace:kube-system,Attempt:0,} returns sandbox id \"6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642\"" Dec 12 18:36:57.170589 containerd[1626]: time="2025-12-12T18:36:57.170427567Z" level=info msg="CreateContainer within sandbox \"6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 12 18:36:57.176769 kubelet[2531]: I1212 18:36:57.176756 2531 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:36:57.178618 containerd[1626]: time="2025-12-12T18:36:57.178601013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0a68423804124305a9de061f38780871,Namespace:kube-system,Attempt:0,} returns sandbox id \"0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50\"" Dec 12 18:36:57.180133 kubelet[2531]: E1212 18:36:57.180110 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost" Dec 12 18:36:57.180585 containerd[1626]: time="2025-12-12T18:36:57.180553890Z" level=info msg="CreateContainer within sandbox \"0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 12 18:36:57.242464 containerd[1626]: time="2025-12-12T18:36:57.242438799Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:55d9ac750f8c9141f337af8b08cf5c9d,Namespace:kube-system,Attempt:0,} returns sandbox id \"39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3\"" Dec 12 18:36:57.243932 containerd[1626]: time="2025-12-12T18:36:57.243916181Z" level=info msg="CreateContainer within sandbox \"39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 12 18:36:57.355688 containerd[1626]: time="2025-12-12T18:36:57.355654509Z" level=info msg="Container dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:36:57.372025 containerd[1626]: time="2025-12-12T18:36:57.371983461Z" level=info msg="Container 7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:36:57.387487 kubelet[2531]: W1212 18:36:57.387439 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 
18:36:57.387566 kubelet[2531]: E1212 18:36:57.387494 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.101:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:57.407127 kubelet[2531]: W1212 18:36:57.407071 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:57.407127 kubelet[2531]: E1212 18:36:57.407127 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.101:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:57.458817 containerd[1626]: time="2025-12-12T18:36:57.458789771Z" level=info msg="CreateContainer within sandbox \"6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d\"" Dec 12 18:36:57.461522 containerd[1626]: time="2025-12-12T18:36:57.461491796Z" level=info msg="StartContainer for \"dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d\"" Dec 12 18:36:57.462333 containerd[1626]: time="2025-12-12T18:36:57.462305516Z" level=info msg="connecting to shim dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d" address="unix:///run/containerd/s/e1d2ed65ce62d30727cf000d750708ad1627c90cce5dd1d3201aa644d26ed0c6" protocol=ttrpc version=3 Dec 12 18:36:57.472541 containerd[1626]: time="2025-12-12T18:36:57.472476919Z" level=info msg="Container e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:36:57.473872 systemd[1]: Started cri-containerd-dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d.scope - libcontainer container dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d. 
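The sandbox and container flow is visible here: RunPodSandbox returns a sandbox ID, containerd connects to a per-sandbox shim over a unix socket (ttrpc), and CreateContainer/StartContainer reuse that socket. Parsing the two "connecting to shim" samples copied from the log shows the kube-apiserver container sharing its sandbox's shim address (the regex is illustrative):

    import re

    lines = [
        'connecting to shim 6728fd0f3f174fd919a9d0eadbe1a1475f2d79b6a1445ee19cddf792c110d642 '
        'address="unix:///run/containerd/s/e1d2ed65ce62d30727cf000d750708ad1627c90cce5dd1d3201aa644d26ed0c6"',
        'connecting to shim dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d '
        'address="unix:///run/containerd/s/e1d2ed65ce62d30727cf000d750708ad1627c90cce5dd1d3201aa644d26ed0c6"',
    ]

    pat = re.compile(r'connecting to shim (\w+) address="([^"]+)"')
    for line in lines:
        shim_id, addr = pat.search(line).groups()
        print(shim_id[:12], "->", addr)
    # Both IDs map to the same socket: the container runs in its sandbox's shim.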
Dec 12 18:36:57.485039 containerd[1626]: time="2025-12-12T18:36:57.485007033Z" level=info msg="CreateContainer within sandbox \"39f4bb66b61ada41fa9ed46d011ec3351602b7f8c6f7e139307bc35a0765b7a3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f\"" Dec 12 18:36:57.485224 containerd[1626]: time="2025-12-12T18:36:57.485204526Z" level=info msg="CreateContainer within sandbox \"0b8bea4423b426070fb904a400c1a63e6fde6a270d0a9e557eb9acd4adf38a50\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0\"" Dec 12 18:36:57.485997 containerd[1626]: time="2025-12-12T18:36:57.485978074Z" level=info msg="StartContainer for \"e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f\"" Dec 12 18:36:57.486880 containerd[1626]: time="2025-12-12T18:36:57.486851855Z" level=info msg="StartContainer for \"7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0\"" Dec 12 18:36:57.487733 containerd[1626]: time="2025-12-12T18:36:57.487714247Z" level=info msg="connecting to shim 7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0" address="unix:///run/containerd/s/e0620acf54df334441d7b870be19d4f2f9435ffda5963bae74692e6510f1854f" protocol=ttrpc version=3 Dec 12 18:36:57.489161 containerd[1626]: time="2025-12-12T18:36:57.489140638Z" level=info msg="connecting to shim e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f" address="unix:///run/containerd/s/acdcd08b74d148a9f84fd005df13bae9d5def801e3061ab4a48a477e54b360e6" protocol=ttrpc version=3 Dec 12 18:36:57.506936 systemd[1]: Started cri-containerd-e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f.scope - libcontainer container e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f. Dec 12 18:36:57.514935 systemd[1]: Started cri-containerd-7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0.scope - libcontainer container 7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0. 
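Every started container also gets a transient systemd scope named cri-containerd-<container-id>.scope, as the "Started" lines show. A trivial round-trip between container ID and scope name; the naming convention is simply read off the log:

    def scope_name(container_id):
        return f"cri-containerd-{container_id}.scope"

    def container_id(scope):
        assert scope.startswith("cri-containerd-") and scope.endswith(".scope")
        return scope[len("cri-containerd-"):-len(".scope")]

    cid = "e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f"
    assert container_id(scope_name(cid)) == cid
    print(scope_name(cid))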
Dec 12 18:36:57.534835 containerd[1626]: time="2025-12-12T18:36:57.534804613Z" level=info msg="StartContainer for \"dddd9c47171c80bbc9a9848e6f3d682a3282e0a994949667e8bf090a1e16d69d\" returns successfully" Dec 12 18:36:57.556106 containerd[1626]: time="2025-12-12T18:36:57.556073628Z" level=info msg="StartContainer for \"e740f205cccb5dd40fc97ceef0c5178094a8a56d949526b060c834c2a74d0a5f\" returns successfully" Dec 12 18:36:57.574682 containerd[1626]: time="2025-12-12T18:36:57.574656620Z" level=info msg="StartContainer for \"7d9b91dc9b6f02eeef5b94fadf7b9f819d9b5d4581b657dcae3af34b93bbe7f0\" returns successfully" Dec 12 18:36:57.591434 kubelet[2531]: W1212 18:36:57.591367 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:57.591434 kubelet[2531]: E1212 18:36:57.591410 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.101:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:57.723393 kubelet[2531]: W1212 18:36:57.723333 2531 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.101:6443: connect: connection refused Dec 12 18:36:57.723393 kubelet[2531]: E1212 18:36:57.723378 2531 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.101:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.101:6443: connect: connection refused" logger="UnhandledError" Dec 12 18:36:57.759146 kubelet[2531]: E1212 18:36:57.759076 2531 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.101:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.101:6443: connect: connection refused" interval="1.6s" Dec 12 18:36:57.981203 kubelet[2531]: I1212 18:36:57.981049 2531 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:36:57.981314 kubelet[2531]: E1212 18:36:57.981242 2531 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.101:6443/api/v1/nodes\": dial tcp 139.178.70.101:6443: connect: connection refused" node="localhost" Dec 12 18:36:58.434353 kubelet[2531]: E1212 18:36:58.434326 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:58.436984 kubelet[2531]: E1212 18:36:58.436959 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:58.437951 kubelet[2531]: E1212 18:36:58.437935 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:59.365307 kubelet[2531]: E1212 18:36:59.365265 2531 nodelease.go:49] "Failed to get node when trying to 
set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Dec 12 18:36:59.440112 kubelet[2531]: E1212 18:36:59.440018 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:59.440112 kubelet[2531]: E1212 18:36:59.440078 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:59.440393 kubelet[2531]: E1212 18:36:59.440297 2531 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Dec 12 18:36:59.582987 kubelet[2531]: I1212 18:36:59.582968 2531 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:36:59.588327 kubelet[2531]: I1212 18:36:59.588298 2531 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Dec 12 18:36:59.588327 kubelet[2531]: E1212 18:36:59.588329 2531 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Dec 12 18:36:59.656924 kubelet[2531]: I1212 18:36:59.656517 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:59.667499 kubelet[2531]: E1212 18:36:59.667465 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Dec 12 18:36:59.667499 kubelet[2531]: I1212 18:36:59.667487 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Dec 12 18:36:59.668577 kubelet[2531]: E1212 18:36:59.668562 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Dec 12 18:36:59.668577 kubelet[2531]: I1212 18:36:59.668584 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Dec 12 18:36:59.669711 kubelet[2531]: E1212 18:36:59.669694 2531 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:00.339137 kubelet[2531]: I1212 18:37:00.339117 2531 apiserver.go:52] "Watching apiserver" Dec 12 18:37:00.356332 kubelet[2531]: I1212 18:37:00.356310 2531 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 18:37:00.440658 kubelet[2531]: I1212 18:37:00.440640 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Dec 12 18:37:00.441908 kubelet[2531]: I1212 18:37:00.440901 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:00.442072 kubelet[2531]: I1212 18:37:00.441977 2531 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:01.210865 systemd[1]: Reload requested from client PID 2805 ('systemctl') (unit session-9.scope)... Dec 12 18:37:01.210876 systemd[1]: Reloading... Dec 12 18:37:01.259821 zram_generator::config[2848]: No configuration found. 
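Across the failed attempts above, the lease controller's reported retry interval doubled: 200ms, 400ms, 800ms, then 1.6s. A minimal doubling-backoff sketch that reproduces those values; the cap is an assumed parameter for illustration, not taken from the log:

    def retry_intervals(base=0.2, factor=2.0, cap=7.0):
        # Yields 0.2, 0.4, 0.8, 1.6, ... capped at `cap` seconds.
        interval = base
        while True:
            yield interval
            interval = min(interval * factor, cap)

    gen = retry_intervals()
    print([f"{next(gen):g}s" for _ in range(4)])  # ['0.2s', '0.4s', '0.8s', '1.6s']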
Dec 12 18:37:01.342932 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Dec 12 18:37:01.419183 systemd[1]: Reloading finished in 208 ms. Dec 12 18:37:01.434521 kubelet[2531]: I1212 18:37:01.434093 2531 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 18:37:01.434417 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:37:01.449004 systemd[1]: kubelet.service: Deactivated successfully. Dec 12 18:37:01.449208 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:37:01.449242 systemd[1]: kubelet.service: Consumed 424ms CPU time, 124.8M memory peak. Dec 12 18:37:01.451052 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 18:37:01.948664 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 18:37:01.956105 (kubelet)[2916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 12 18:37:02.047355 kubelet[2916]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:37:02.047355 kubelet[2916]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 12 18:37:02.047355 kubelet[2916]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 12 18:37:02.047590 kubelet[2916]: I1212 18:37:02.047393 2916 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 12 18:37:02.051206 kubelet[2916]: I1212 18:37:02.051186 2916 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 12 18:37:02.051206 kubelet[2916]: I1212 18:37:02.051197 2916 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 12 18:37:02.051330 kubelet[2916]: I1212 18:37:02.051320 2916 server.go:954] "Client rotation is on, will bootstrap in background" Dec 12 18:37:02.053643 kubelet[2916]: I1212 18:37:02.052536 2916 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 12 18:37:02.054310 kubelet[2916]: I1212 18:37:02.054299 2916 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 12 18:37:02.058008 kubelet[2916]: I1212 18:37:02.057778 2916 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 12 18:37:02.060628 kubelet[2916]: I1212 18:37:02.060617 2916 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 12 18:37:02.060854 kubelet[2916]: I1212 18:37:02.060836 2916 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 12 18:37:02.060996 kubelet[2916]: I1212 18:37:02.060896 2916 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 12 18:37:02.061072 kubelet[2916]: I1212 18:37:02.061066 2916 topology_manager.go:138] "Creating topology manager with none policy" Dec 12 18:37:02.061110 kubelet[2916]: I1212 18:37:02.061105 2916 container_manager_linux.go:304] "Creating device plugin manager" Dec 12 18:37:02.061196 kubelet[2916]: I1212 18:37:02.061191 2916 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:37:02.061346 kubelet[2916]: I1212 18:37:02.061340 2916 kubelet.go:446] "Attempting to sync node with API server" Dec 12 18:37:02.061399 kubelet[2916]: I1212 18:37:02.061393 2916 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 12 18:37:02.061441 kubelet[2916]: I1212 18:37:02.061436 2916 kubelet.go:352] "Adding apiserver pod source" Dec 12 18:37:02.061474 kubelet[2916]: I1212 18:37:02.061469 2916 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 12 18:37:02.062089 kubelet[2916]: I1212 18:37:02.062078 2916 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Dec 12 18:37:02.063942 kubelet[2916]: I1212 18:37:02.063898 2916 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 12 18:37:02.064158 kubelet[2916]: I1212 18:37:02.064146 2916 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 12 18:37:02.064245 kubelet[2916]: I1212 18:37:02.064165 2916 server.go:1287] "Started kubelet" Dec 12 18:37:02.065804 kubelet[2916]: I1212 18:37:02.065265 2916 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 12 18:37:02.069400 kubelet[2916]: I1212 18:37:02.069375 2916 server.go:169] "Starting to 
listen" address="0.0.0.0" port=10250 Dec 12 18:37:02.070527 kubelet[2916]: I1212 18:37:02.070394 2916 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 12 18:37:02.076369 kubelet[2916]: I1212 18:37:02.076172 2916 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 12 18:37:02.076440 kubelet[2916]: E1212 18:37:02.076398 2916 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Dec 12 18:37:02.077415 kubelet[2916]: I1212 18:37:02.076998 2916 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 12 18:37:02.077415 kubelet[2916]: I1212 18:37:02.077066 2916 reconciler.go:26] "Reconciler: start to sync state" Dec 12 18:37:02.080097 kubelet[2916]: I1212 18:37:02.080087 2916 server.go:479] "Adding debug handlers to kubelet server" Dec 12 18:37:02.080743 kubelet[2916]: I1212 18:37:02.080717 2916 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 12 18:37:02.082173 kubelet[2916]: I1212 18:37:02.082161 2916 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 12 18:37:02.083889 kubelet[2916]: I1212 18:37:02.083778 2916 factory.go:221] Registration of the systemd container factory successfully Dec 12 18:37:02.083889 kubelet[2916]: I1212 18:37:02.083835 2916 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 12 18:37:02.086636 kubelet[2916]: I1212 18:37:02.086619 2916 factory.go:221] Registration of the containerd container factory successfully Dec 12 18:37:02.093025 kubelet[2916]: E1212 18:37:02.093003 2916 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 12 18:37:02.095384 kubelet[2916]: I1212 18:37:02.095259 2916 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 12 18:37:02.096770 kubelet[2916]: I1212 18:37:02.096761 2916 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 12 18:37:02.097016 kubelet[2916]: I1212 18:37:02.096871 2916 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 12 18:37:02.097016 kubelet[2916]: I1212 18:37:02.096888 2916 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
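The HardEvictionThresholds JSON in the container-manager line above encodes the kubelet defaults: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%. A simplified restatement of the check, not the eviction manager itself; the thresholds are copied from the log:

    THRESHOLDS = {
        "memory.available": ("quantity", 100 * 1024 * 1024),  # 100Mi
        "nodefs.available": ("percentage", 0.10),
        "nodefs.inodesFree": ("percentage", 0.05),
        "imagefs.available": ("percentage", 0.15),
        "imagefs.inodesFree": ("percentage", 0.05),
    }

    def below_hard_threshold(signal, available, capacity):
        kind, value = THRESHOLDS[signal]
        limit = value if kind == "quantity" else value * capacity
        return available < limit

    # A node with 64Mi of 512Mi memory available trips the memory signal:
    print(below_hard_threshold("memory.available", 64 * 2**20, 512 * 2**20))  # True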
Dec 12 18:37:02.097016 kubelet[2916]: I1212 18:37:02.096893 2916 kubelet.go:2382] "Starting kubelet main sync loop" Dec 12 18:37:02.097016 kubelet[2916]: E1212 18:37:02.096918 2916 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 12 18:37:02.135794 kubelet[2916]: I1212 18:37:02.135765 2916 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.135881 2916 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.135895 2916 state_mem.go:36] "Initialized new in-memory state store" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.135988 2916 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.135996 2916 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.136008 2916 policy_none.go:49] "None policy: Start" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.136013 2916 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.136019 2916 state_mem.go:35] "Initializing new in-memory state store" Dec 12 18:37:02.136102 kubelet[2916]: I1212 18:37:02.136077 2916 state_mem.go:75] "Updated machine memory state" Dec 12 18:37:02.140222 kubelet[2916]: I1212 18:37:02.140209 2916 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 12 18:37:02.140889 kubelet[2916]: I1212 18:37:02.140877 2916 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 12 18:37:02.140955 kubelet[2916]: I1212 18:37:02.140936 2916 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 12 18:37:02.141289 kubelet[2916]: I1212 18:37:02.141130 2916 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 12 18:37:02.142083 kubelet[2916]: E1212 18:37:02.142056 2916 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 12 18:37:02.197568 kubelet[2916]: I1212 18:37:02.197540 2916 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:02.207087 kubelet[2916]: E1212 18:37:02.206911 2916 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:02.208721 kubelet[2916]: I1212 18:37:02.208707 2916 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Dec 12 18:37:02.208824 kubelet[2916]: I1212 18:37:02.208808 2916 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.229507 kubelet[2916]: E1212 18:37:02.229482 2916 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.230180 kubelet[2916]: E1212 18:37:02.230159 2916 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Dec 12 18:37:02.242573 kubelet[2916]: I1212 18:37:02.242547 2916 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Dec 12 18:37:02.277092 kubelet[2916]: I1212 18:37:02.277003 2916 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Dec 12 18:37:02.277092 kubelet[2916]: I1212 18:37:02.277065 2916 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Dec 12 18:37:02.278965 kubelet[2916]: I1212 18:37:02.278884 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:02.278965 kubelet[2916]: I1212 18:37:02.278928 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:02.278965 kubelet[2916]: I1212 18:37:02.278945 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.278965 kubelet[2916]: I1212 18:37:02.278959 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a68423804124305a9de061f38780871-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0a68423804124305a9de061f38780871\") " pod="kube-system/kube-scheduler-localhost" Dec 12 18:37:02.279088 kubelet[2916]: I1212 18:37:02.278982 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5cb32fef9f4c71fe4bba9b9f7b28ac3-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"c5cb32fef9f4c71fe4bba9b9f7b28ac3\") " 
pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:02.279088 kubelet[2916]: I1212 18:37:02.279000 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.279088 kubelet[2916]: I1212 18:37:02.279013 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.279088 kubelet[2916]: I1212 18:37:02.279025 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:02.279088 kubelet[2916]: I1212 18:37:02.279036 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/55d9ac750f8c9141f337af8b08cf5c9d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"55d9ac750f8c9141f337af8b08cf5c9d\") " pod="kube-system/kube-controller-manager-localhost" Dec 12 18:37:03.062552 kubelet[2916]: I1212 18:37:03.062403 2916 apiserver.go:52] "Watching apiserver" Dec 12 18:37:03.125417 kubelet[2916]: I1212 18:37:03.125396 2916 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:03.133776 kubelet[2916]: E1212 18:37:03.133750 2916 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Dec 12 18:37:03.177352 kubelet[2916]: I1212 18:37:03.177323 2916 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 12 18:37:03.189716 kubelet[2916]: I1212 18:37:03.189368 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.189350839 podStartE2EDuration="3.189350839s" podCreationTimestamp="2025-12-12 18:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:03.173177932 +0000 UTC m=+1.204540614" watchObservedRunningTime="2025-12-12 18:37:03.189350839 +0000 UTC m=+1.220713516" Dec 12 18:37:03.189716 kubelet[2916]: I1212 18:37:03.189481 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.189473448 podStartE2EDuration="3.189473448s" podCreationTimestamp="2025-12-12 18:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:03.18933776 +0000 UTC m=+1.220700443" watchObservedRunningTime="2025-12-12 18:37:03.189473448 +0000 UTC m=+1.220836125" Dec 12 18:37:03.214909 kubelet[2916]: I1212 18:37:03.214421 2916 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.214412898 podStartE2EDuration="3.214412898s" podCreationTimestamp="2025-12-12 18:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:03.214362965 +0000 UTC m=+1.245725639" watchObservedRunningTime="2025-12-12 18:37:03.214412898 +0000 UTC m=+1.245775574" Dec 12 18:37:06.482513 kubelet[2916]: I1212 18:37:06.482485 2916 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 12 18:37:06.482954 containerd[1626]: time="2025-12-12T18:37:06.482937209Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 12 18:37:06.483358 kubelet[2916]: I1212 18:37:06.483337 2916 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 12 18:37:07.173707 systemd[1]: Created slice kubepods-besteffort-pod1638031c_631d_4329_996f_2de5eec90533.slice - libcontainer container kubepods-besteffort-pod1638031c_631d_4329_996f_2de5eec90533.slice. Dec 12 18:37:07.209996 kubelet[2916]: I1212 18:37:07.209715 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1638031c-631d-4329-996f-2de5eec90533-kube-proxy\") pod \"kube-proxy-68wc6\" (UID: \"1638031c-631d-4329-996f-2de5eec90533\") " pod="kube-system/kube-proxy-68wc6" Dec 12 18:37:07.209996 kubelet[2916]: I1212 18:37:07.209737 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4tj\" (UniqueName: \"kubernetes.io/projected/1638031c-631d-4329-996f-2de5eec90533-kube-api-access-2v4tj\") pod \"kube-proxy-68wc6\" (UID: \"1638031c-631d-4329-996f-2de5eec90533\") " pod="kube-system/kube-proxy-68wc6" Dec 12 18:37:07.209996 kubelet[2916]: I1212 18:37:07.209757 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1638031c-631d-4329-996f-2de5eec90533-xtables-lock\") pod \"kube-proxy-68wc6\" (UID: \"1638031c-631d-4329-996f-2de5eec90533\") " pod="kube-system/kube-proxy-68wc6" Dec 12 18:37:07.209996 kubelet[2916]: I1212 18:37:07.209768 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1638031c-631d-4329-996f-2de5eec90533-lib-modules\") pod \"kube-proxy-68wc6\" (UID: \"1638031c-631d-4329-996f-2de5eec90533\") " pod="kube-system/kube-proxy-68wc6" Dec 12 18:37:07.315214 kubelet[2916]: E1212 18:37:07.314844 2916 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Dec 12 18:37:07.315214 kubelet[2916]: E1212 18:37:07.314864 2916 projected.go:194] Error preparing data for projected volume kube-api-access-2v4tj for pod kube-system/kube-proxy-68wc6: configmap "kube-root-ca.crt" not found Dec 12 18:37:07.315214 kubelet[2916]: E1212 18:37:07.314901 2916 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638031c-631d-4329-996f-2de5eec90533-kube-api-access-2v4tj podName:1638031c-631d-4329-996f-2de5eec90533 nodeName:}" failed. No retries permitted until 2025-12-12 18:37:07.81488793 +0000 UTC m=+5.846250603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2v4tj" (UniqueName: "kubernetes.io/projected/1638031c-631d-4329-996f-2de5eec90533-kube-api-access-2v4tj") pod "kube-proxy-68wc6" (UID: "1638031c-631d-4329-996f-2de5eec90533") : configmap "kube-root-ca.crt" not found Dec 12 18:37:07.548495 systemd[1]: Created slice kubepods-besteffort-pod933df11c_6541_4f3b_9ff9_1320ce3f5196.slice - libcontainer container kubepods-besteffort-pod933df11c_6541_4f3b_9ff9_1320ce3f5196.slice. Dec 12 18:37:07.612012 kubelet[2916]: I1212 18:37:07.611986 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/933df11c-6541-4f3b-9ff9-1320ce3f5196-var-lib-calico\") pod \"tigera-operator-7dcd859c48-v6tqt\" (UID: \"933df11c-6541-4f3b-9ff9-1320ce3f5196\") " pod="tigera-operator/tigera-operator-7dcd859c48-v6tqt" Dec 12 18:37:07.612012 kubelet[2916]: I1212 18:37:07.612008 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7zjs\" (UniqueName: \"kubernetes.io/projected/933df11c-6541-4f3b-9ff9-1320ce3f5196-kube-api-access-c7zjs\") pod \"tigera-operator-7dcd859c48-v6tqt\" (UID: \"933df11c-6541-4f3b-9ff9-1320ce3f5196\") " pod="tigera-operator/tigera-operator-7dcd859c48-v6tqt" Dec 12 18:37:07.851020 containerd[1626]: time="2025-12-12T18:37:07.850951520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-v6tqt,Uid:933df11c-6541-4f3b-9ff9-1320ce3f5196,Namespace:tigera-operator,Attempt:0,}" Dec 12 18:37:07.864086 containerd[1626]: time="2025-12-12T18:37:07.864058549Z" level=info msg="connecting to shim 3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85" address="unix:///run/containerd/s/8dee64bd63d062bcf8f8001539ab10004871de21118f56f886faffe528b2fae3" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:07.886881 systemd[1]: Started cri-containerd-3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85.scope - libcontainer container 3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85. Dec 12 18:37:07.926940 containerd[1626]: time="2025-12-12T18:37:07.926864600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-v6tqt,Uid:933df11c-6541-4f3b-9ff9-1320ce3f5196,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85\"" Dec 12 18:37:07.929043 containerd[1626]: time="2025-12-12T18:37:07.928405006Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 12 18:37:08.083755 containerd[1626]: time="2025-12-12T18:37:08.083724817Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-68wc6,Uid:1638031c-631d-4329-996f-2de5eec90533,Namespace:kube-system,Attempt:0,}" Dec 12 18:37:08.129353 containerd[1626]: time="2025-12-12T18:37:08.129021624Z" level=info msg="connecting to shim 574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317" address="unix:///run/containerd/s/013c75ed1cc36a85a56982b671c2f41d2f1f196605e478713c4a133c3f4394f9" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:08.147926 systemd[1]: Started cri-containerd-574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317.scope - libcontainer container 574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317. 
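The nestedpendingoperations entry above schedules the retry 500ms out (durationBeforeRetry 500ms); the kubelet backs off exponentially on repeated failures of the same volume operation. A small sketch of that policy: the 500ms initial delay is from the log, while the doubling and the roughly two-minute cap are assumed upstream defaults:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond           // initial durationBeforeRetry, as logged
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap on the backoff
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}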
Dec 12 18:37:08.166732 containerd[1626]: time="2025-12-12T18:37:08.166705864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-68wc6,Uid:1638031c-631d-4329-996f-2de5eec90533,Namespace:kube-system,Attempt:0,} returns sandbox id \"574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317\"" Dec 12 18:37:08.168465 containerd[1626]: time="2025-12-12T18:37:08.168449257Z" level=info msg="CreateContainer within sandbox \"574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 12 18:37:08.185921 containerd[1626]: time="2025-12-12T18:37:08.185893503Z" level=info msg="Container eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:08.189078 containerd[1626]: time="2025-12-12T18:37:08.189054693Z" level=info msg="CreateContainer within sandbox \"574e1d411afb29600626a731c4f93016174ec717439a82939b60371888739317\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f\"" Dec 12 18:37:08.189459 containerd[1626]: time="2025-12-12T18:37:08.189426769Z" level=info msg="StartContainer for \"eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f\"" Dec 12 18:37:08.191204 containerd[1626]: time="2025-12-12T18:37:08.190600117Z" level=info msg="connecting to shim eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f" address="unix:///run/containerd/s/013c75ed1cc36a85a56982b671c2f41d2f1f196605e478713c4a133c3f4394f9" protocol=ttrpc version=3 Dec 12 18:37:08.207893 systemd[1]: Started cri-containerd-eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f.scope - libcontainer container eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f. Dec 12 18:37:08.281937 containerd[1626]: time="2025-12-12T18:37:08.281904403Z" level=info msg="StartContainer for \"eb2107af71552066133d888e61875bf02096ce206e535e5e161cfeb8e796146f\" returns successfully" Dec 12 18:37:09.160514 kubelet[2916]: I1212 18:37:09.160391 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-68wc6" podStartSLOduration=2.160376954 podStartE2EDuration="2.160376954s" podCreationTimestamp="2025-12-12 18:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:09.160300187 +0000 UTC m=+7.191662860" watchObservedRunningTime="2025-12-12 18:37:09.160376954 +0000 UTC m=+7.191739631" Dec 12 18:37:09.420179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount701650227.mount: Deactivated successfully. 
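The pod_startup_latency_tracker entry above reports podStartSLOduration as, in effect, the gap between the pod's creation timestamp and the time it was observed running (image pull time would be subtracted, but nothing was pulled for kube-proxy). Recomputing it from the timestamps in that entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Timestamps copied from the tracker entry; parse errors ignored in this sketch.
	created, _ := time.Parse(layout, "2025-12-12 18:37:07 +0000 UTC")
	running, _ := time.Parse(layout, "2025-12-12 18:37:09.160300187 +0000 UTC")
	fmt.Println(running.Sub(created)) // ~2.16s, in line with podStartSLOduration=2.160376954
}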
Dec 12 18:37:10.194591 containerd[1626]: time="2025-12-12T18:37:10.194555744Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:10.197467 containerd[1626]: time="2025-12-12T18:37:10.197449507Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 12 18:37:10.205452 containerd[1626]: time="2025-12-12T18:37:10.205423331Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:10.211347 containerd[1626]: time="2025-12-12T18:37:10.211323421Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:10.211794 containerd[1626]: time="2025-12-12T18:37:10.211626588Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.282457618s" Dec 12 18:37:10.211794 containerd[1626]: time="2025-12-12T18:37:10.211646355Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 12 18:37:10.213907 containerd[1626]: time="2025-12-12T18:37:10.213889342Z" level=info msg="CreateContainer within sandbox \"3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 12 18:37:10.242231 containerd[1626]: time="2025-12-12T18:37:10.242195893Z" level=info msg="Container d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:10.244331 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3456510886.mount: Deactivated successfully. Dec 12 18:37:10.246627 containerd[1626]: time="2025-12-12T18:37:10.246607924Z" level=info msg="CreateContainer within sandbox \"3bbf2c97dbae002f4d71019909b4fceaa8248bca41fa26fe700e9028488e4a85\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a\"" Dec 12 18:37:10.246991 containerd[1626]: time="2025-12-12T18:37:10.246976704Z" level=info msg="StartContainer for \"d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a\"" Dec 12 18:37:10.248412 containerd[1626]: time="2025-12-12T18:37:10.248389863Z" level=info msg="connecting to shim d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a" address="unix:///run/containerd/s/8dee64bd63d062bcf8f8001539ab10004871de21118f56f886faffe528b2fae3" protocol=ttrpc version=3 Dec 12 18:37:10.265882 systemd[1]: Started cri-containerd-d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a.scope - libcontainer container d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a. 
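Some quick arithmetic on the pull above: containerd read 25,061,691 bytes for quay.io/tigera/operator:v1.38.7 in 2.282457618s, about 10.5 MiB/s; the image size it reports is 25,057,686 bytes. The figures below are copied from the log:

package main

import "fmt"

func main() {
	const bytesRead = 25061691.0 // "bytes read" from the stop-pulling entry
	const seconds = 2.282457618  // pull duration from the Pulled entry
	fmt.Printf("%.1f MiB/s\n", bytesRead/seconds/(1<<20))
}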
Dec 12 18:37:10.281807 containerd[1626]: time="2025-12-12T18:37:10.281776601Z" level=info msg="StartContainer for \"d384697ffe0c49b179339162ac6fbfd62ff481b1b257c64ab089affdda3d8b8a\" returns successfully" Dec 12 18:37:11.143799 kubelet[2916]: I1212 18:37:11.143687 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-v6tqt" podStartSLOduration=1.859121573 podStartE2EDuration="4.143476053s" podCreationTimestamp="2025-12-12 18:37:07 +0000 UTC" firstStartedPulling="2025-12-12 18:37:07.927774243 +0000 UTC m=+5.959136918" lastFinishedPulling="2025-12-12 18:37:10.212128727 +0000 UTC m=+8.243491398" observedRunningTime="2025-12-12 18:37:11.143349541 +0000 UTC m=+9.174712218" watchObservedRunningTime="2025-12-12 18:37:11.143476053 +0000 UTC m=+9.174838727" Dec 12 18:37:15.893198 sudo[1949]: pam_unix(sudo:session): session closed for user root Dec 12 18:37:15.894252 sshd[1948]: Connection closed by 147.75.109.163 port 44762 Dec 12 18:37:15.895329 sshd-session[1945]: pam_unix(sshd:session): session closed for user core Dec 12 18:37:15.897439 systemd[1]: sshd@6-139.178.70.101:22-147.75.109.163:44762.service: Deactivated successfully. Dec 12 18:37:15.899291 systemd[1]: session-9.scope: Deactivated successfully. Dec 12 18:37:15.899653 systemd[1]: session-9.scope: Consumed 3.088s CPU time, 154.8M memory peak. Dec 12 18:37:15.900687 systemd-logind[1608]: Session 9 logged out. Waiting for processes to exit. Dec 12 18:37:15.903554 systemd-logind[1608]: Removed session 9. Dec 12 18:37:20.040084 systemd[1]: Created slice kubepods-besteffort-podb1c35a63_45dd_45ad_a2f3_d561fc0ddae1.slice - libcontainer container kubepods-besteffort-podb1c35a63_45dd_45ad_a2f3_d561fc0ddae1.slice. Dec 12 18:37:20.095178 kubelet[2916]: I1212 18:37:20.095138 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c35a63-45dd-45ad-a2f3-d561fc0ddae1-tigera-ca-bundle\") pod \"calico-typha-6c75b45c8-kkpgm\" (UID: \"b1c35a63-45dd-45ad-a2f3-d561fc0ddae1\") " pod="calico-system/calico-typha-6c75b45c8-kkpgm" Dec 12 18:37:20.095178 kubelet[2916]: I1212 18:37:20.095174 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b1c35a63-45dd-45ad-a2f3-d561fc0ddae1-typha-certs\") pod \"calico-typha-6c75b45c8-kkpgm\" (UID: \"b1c35a63-45dd-45ad-a2f3-d561fc0ddae1\") " pod="calico-system/calico-typha-6c75b45c8-kkpgm" Dec 12 18:37:20.095178 kubelet[2916]: I1212 18:37:20.095190 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvc4x\" (UniqueName: \"kubernetes.io/projected/b1c35a63-45dd-45ad-a2f3-d561fc0ddae1-kube-api-access-bvc4x\") pod \"calico-typha-6c75b45c8-kkpgm\" (UID: \"b1c35a63-45dd-45ad-a2f3-d561fc0ddae1\") " pod="calico-system/calico-typha-6c75b45c8-kkpgm" Dec 12 18:37:20.175001 systemd[1]: Created slice kubepods-besteffort-pode8cef2ea_b857_4519_8fb6_e2a4196bd121.slice - libcontainer container kubepods-besteffort-pode8cef2ea_b857_4519_8fb6_e2a4196bd121.slice. 
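The slice names above follow the kubelet systemd cgroup driver's naming scheme: kubepods-<qos>-pod<uid>.slice, with the dashes in the pod UID escaped to underscores. A small reconstruction of the pattern, inferred from the log lines themselves rather than taken from kubelet source:

package main

import (
	"fmt"
	"strings"
)

// sliceName builds the systemd slice name for a pod, matching the
// "Created slice kubepods-besteffort-pod..." entries in this log.
func sliceName(qos, podUID string) string {
	return "kubepods-" + qos + "-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	// UID of calico-typha-6c75b45c8-kkpgm, from the reconciler entries above.
	fmt.Println(sliceName("besteffort", "b1c35a63-45dd-45ad-a2f3-d561fc0ddae1"))
	// Output: kubepods-besteffort-podb1c35a63_45dd_45ad_a2f3_d561fc0ddae1.slice
}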
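Most of what follows is one repeating failure: the kubelet's plugin prober execs the Calico FlexVolume driver at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument and tries to unmarshal its stdout as JSON; the binary is not installed yet, so the exec fails and the empty output produces "unexpected end of JSON input". A rough sketch of that probe; the JSON status shape is the general FlexVolume convention, stated here as an assumption rather than something shown in this log:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus approximates the JSON a FlexVolume driver prints for "init".
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		// Mirrors driver-call.go:149: the uds binary is absent on this node.
		fmt.Println("driver call failed:", err)
	}
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// Mirrors driver-call.go:262 with empty output:
		// "unexpected end of JSON input".
		fmt.Println("unmarshal failed:", err)
		return
	}
	fmt.Printf("driver init: %+v\n", st)
}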
Dec 12 18:37:20.196388 kubelet[2916]: I1212 18:37:20.196330 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-cni-net-dir\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.196589 kubelet[2916]: I1212 18:37:20.196477 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8cef2ea-b857-4519-8fb6-e2a4196bd121-tigera-ca-bundle\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.196589 kubelet[2916]: I1212 18:37:20.196496 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-var-run-calico\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.196740 kubelet[2916]: I1212 18:37:20.196679 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-lib-modules\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.196740 kubelet[2916]: I1212 18:37:20.196699 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-var-lib-calico\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.196740 kubelet[2916]: I1212 18:37:20.196714 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrvxs\" (UniqueName: \"kubernetes.io/projected/e8cef2ea-b857-4519-8fb6-e2a4196bd121-kube-api-access-zrvxs\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197024 kubelet[2916]: I1212 18:37:20.196886 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-policysync\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197024 kubelet[2916]: I1212 18:37:20.196915 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-cni-log-dir\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197024 kubelet[2916]: I1212 18:37:20.196931 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e8cef2ea-b857-4519-8fb6-e2a4196bd121-node-certs\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197024 kubelet[2916]: I1212 18:37:20.196960 2916 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-xtables-lock\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197024 kubelet[2916]: I1212 18:37:20.196988 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-flexvol-driver-host\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.197172 kubelet[2916]: I1212 18:37:20.197007 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e8cef2ea-b857-4519-8fb6-e2a4196bd121-cni-bin-dir\") pod \"calico-node-xnl9c\" (UID: \"e8cef2ea-b857-4519-8fb6-e2a4196bd121\") " pod="calico-system/calico-node-xnl9c" Dec 12 18:37:20.308867 kubelet[2916]: E1212 18:37:20.308330 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.308867 kubelet[2916]: W1212 18:37:20.308343 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.312192 kubelet[2916]: E1212 18:37:20.312107 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.314263 kubelet[2916]: E1212 18:37:20.314105 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.314263 kubelet[2916]: W1212 18:37:20.314119 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.314263 kubelet[2916]: E1212 18:37:20.314135 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.356297 containerd[1626]: time="2025-12-12T18:37:20.356266876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c75b45c8-kkpgm,Uid:b1c35a63-45dd-45ad-a2f3-d561fc0ddae1,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:20.387328 kubelet[2916]: E1212 18:37:20.385811 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387543 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.388036 kubelet[2916]: W1212 18:37:20.387557 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387570 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387709 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.388036 kubelet[2916]: W1212 18:37:20.387714 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387720 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387906 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.388036 kubelet[2916]: W1212 18:37:20.387911 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.388036 kubelet[2916]: E1212 18:37:20.387916 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.388858 kubelet[2916]: E1212 18:37:20.388824 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.388858 kubelet[2916]: W1212 18:37:20.388836 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.389414 kubelet[2916]: E1212 18:37:20.388972 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.389724 kubelet[2916]: E1212 18:37:20.389705 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.389771 kubelet[2916]: W1212 18:37:20.389764 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.389841 kubelet[2916]: E1212 18:37:20.389833 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.390153 kubelet[2916]: E1212 18:37:20.390126 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.390288 kubelet[2916]: W1212 18:37:20.390217 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.390337 kubelet[2916]: E1212 18:37:20.390329 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.393660 kubelet[2916]: E1212 18:37:20.393631 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.393842 kubelet[2916]: W1212 18:37:20.393752 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.393842 kubelet[2916]: E1212 18:37:20.393769 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.394221 kubelet[2916]: E1212 18:37:20.394213 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.394275 kubelet[2916]: W1212 18:37:20.394267 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.394312 kubelet[2916]: E1212 18:37:20.394306 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.394435 kubelet[2916]: E1212 18:37:20.394428 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.394528 kubelet[2916]: W1212 18:37:20.394470 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.394528 kubelet[2916]: E1212 18:37:20.394478 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.394631 kubelet[2916]: E1212 18:37:20.394625 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.394670 kubelet[2916]: W1212 18:37:20.394664 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.394738 kubelet[2916]: E1212 18:37:20.394732 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.395042 kubelet[2916]: E1212 18:37:20.394992 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.395042 kubelet[2916]: W1212 18:37:20.394998 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.395042 kubelet[2916]: E1212 18:37:20.395005 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.395681 kubelet[2916]: E1212 18:37:20.395673 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.396860 kubelet[2916]: W1212 18:37:20.395763 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.396860 kubelet[2916]: E1212 18:37:20.395774 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.397981 kubelet[2916]: E1212 18:37:20.397826 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.397981 kubelet[2916]: W1212 18:37:20.397839 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.397981 kubelet[2916]: E1212 18:37:20.397851 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.397981 kubelet[2916]: E1212 18:37:20.397934 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.397981 kubelet[2916]: W1212 18:37:20.397939 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.397981 kubelet[2916]: E1212 18:37:20.397944 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.398362 kubelet[2916]: E1212 18:37:20.398157 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.398362 kubelet[2916]: W1212 18:37:20.398166 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.398362 kubelet[2916]: E1212 18:37:20.398175 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.398362 kubelet[2916]: E1212 18:37:20.398283 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.398362 kubelet[2916]: W1212 18:37:20.398290 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.398362 kubelet[2916]: E1212 18:37:20.398298 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.399793 kubelet[2916]: E1212 18:37:20.399735 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.399793 kubelet[2916]: W1212 18:37:20.399756 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.399793 kubelet[2916]: E1212 18:37:20.399771 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.400556 kubelet[2916]: E1212 18:37:20.400492 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.400556 kubelet[2916]: W1212 18:37:20.400502 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.400556 kubelet[2916]: E1212 18:37:20.400509 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.400748 kubelet[2916]: E1212 18:37:20.400680 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.400748 kubelet[2916]: W1212 18:37:20.400688 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.400748 kubelet[2916]: E1212 18:37:20.400694 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.400874 kubelet[2916]: E1212 18:37:20.400868 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.401065 kubelet[2916]: W1212 18:37:20.400949 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.401065 kubelet[2916]: E1212 18:37:20.400958 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.401217 containerd[1626]: time="2025-12-12T18:37:20.401198356Z" level=info msg="connecting to shim 5c98ec3e6ea6e130341c288521fda3d8f58555e4cbdba6187d6b0dc63b3ae9a0" address="unix:///run/containerd/s/ea630612c74db7d0f393e1e9658aaafbafc71eb4ed4d1150da250369a0340b21" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:20.401282 kubelet[2916]: E1212 18:37:20.401274 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.401324 kubelet[2916]: W1212 18:37:20.401317 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.401437 kubelet[2916]: E1212 18:37:20.401361 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.401437 kubelet[2916]: I1212 18:37:20.401383 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhlst\" (UniqueName: \"kubernetes.io/projected/8031345b-ae64-4c92-b720-42f91263ef98-kube-api-access-hhlst\") pod \"csi-node-driver-kv4lq\" (UID: \"8031345b-ae64-4c92-b720-42f91263ef98\") " pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:20.401916 kubelet[2916]: E1212 18:37:20.401819 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.401916 kubelet[2916]: W1212 18:37:20.401831 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.401916 kubelet[2916]: E1212 18:37:20.401842 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.401916 kubelet[2916]: I1212 18:37:20.401856 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8031345b-ae64-4c92-b720-42f91263ef98-registration-dir\") pod \"csi-node-driver-kv4lq\" (UID: \"8031345b-ae64-4c92-b720-42f91263ef98\") " pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:20.403067 kubelet[2916]: E1212 18:37:20.402845 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.403067 kubelet[2916]: W1212 18:37:20.402859 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.403067 kubelet[2916]: E1212 18:37:20.402871 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.403067 kubelet[2916]: I1212 18:37:20.402887 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8031345b-ae64-4c92-b720-42f91263ef98-socket-dir\") pod \"csi-node-driver-kv4lq\" (UID: \"8031345b-ae64-4c92-b720-42f91263ef98\") " pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:20.403067 kubelet[2916]: E1212 18:37:20.403004 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.403067 kubelet[2916]: W1212 18:37:20.403009 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.403067 kubelet[2916]: E1212 18:37:20.403015 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.403067 kubelet[2916]: I1212 18:37:20.403024 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8031345b-ae64-4c92-b720-42f91263ef98-kubelet-dir\") pod \"csi-node-driver-kv4lq\" (UID: \"8031345b-ae64-4c92-b720-42f91263ef98\") " pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:20.404329 kubelet[2916]: E1212 18:37:20.404255 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.404600 kubelet[2916]: W1212 18:37:20.404469 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.404600 kubelet[2916]: E1212 18:37:20.404524 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.404600 kubelet[2916]: I1212 18:37:20.404550 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8031345b-ae64-4c92-b720-42f91263ef98-varrun\") pod \"csi-node-driver-kv4lq\" (UID: \"8031345b-ae64-4c92-b720-42f91263ef98\") " pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:20.405227 kubelet[2916]: E1212 18:37:20.405121 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.405227 kubelet[2916]: W1212 18:37:20.405129 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.405227 kubelet[2916]: E1212 18:37:20.405148 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.405495 kubelet[2916]: E1212 18:37:20.405464 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.405677 kubelet[2916]: W1212 18:37:20.405596 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.405677 kubelet[2916]: E1212 18:37:20.405620 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.406112 kubelet[2916]: E1212 18:37:20.405956 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.406112 kubelet[2916]: W1212 18:37:20.405966 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.406112 kubelet[2916]: E1212 18:37:20.405985 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.406112 kubelet[2916]: E1212 18:37:20.406065 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.406112 kubelet[2916]: W1212 18:37:20.406072 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.406112 kubelet[2916]: E1212 18:37:20.406106 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:20.406286 kubelet[2916]: E1212 18:37:20.406278 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.406377 kubelet[2916]: W1212 18:37:20.406326 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.406377 kubelet[2916]: E1212 18:37:20.406346 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.406805 kubelet[2916]: E1212 18:37:20.406633 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.406805 kubelet[2916]: W1212 18:37:20.406648 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.406805 kubelet[2916]: E1212 18:37:20.406655 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.406805 kubelet[2916]: E1212 18:37:20.406757 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.406805 kubelet[2916]: W1212 18:37:20.406764 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.406805 kubelet[2916]: E1212 18:37:20.406771 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.407133 kubelet[2916]: E1212 18:37:20.407079 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.407133 kubelet[2916]: W1212 18:37:20.407091 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.407262 kubelet[2916]: E1212 18:37:20.407162 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:20.407686 kubelet[2916]: E1212 18:37:20.407501 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:20.407686 kubelet[2916]: W1212 18:37:20.407512 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:20.407686 kubelet[2916]: E1212 18:37:20.407521 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Dec 12 18:37:20.426904 systemd[1]: Started cri-containerd-5c98ec3e6ea6e130341c288521fda3d8f58555e4cbdba6187d6b0dc63b3ae9a0.scope - libcontainer container 5c98ec3e6ea6e130341c288521fda3d8f58555e4cbdba6187d6b0dc63b3ae9a0.
Dec 12 18:37:20.474282 containerd[1626]: time="2025-12-12T18:37:20.474253139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c75b45c8-kkpgm,Uid:b1c35a63-45dd-45ad-a2f3-d561fc0ddae1,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c98ec3e6ea6e130341c288521fda3d8f58555e4cbdba6187d6b0dc63b3ae9a0\""
Dec 12 18:37:20.476197 containerd[1626]: time="2025-12-12T18:37:20.476089938Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 12 18:37:20.479267 containerd[1626]: time="2025-12-12T18:37:20.479243523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xnl9c,Uid:e8cef2ea-b857-4519-8fb6-e2a4196bd121,Namespace:calico-system,Attempt:0,}"
[... identical FlexVolume probe-failure triplet repeats through 18:37:20.535 ...]
Dec 12 18:37:20.534409 containerd[1626]: time="2025-12-12T18:37:20.534366459Z" level=info msg="connecting to shim bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972" address="unix:///run/containerd/s/40f4c0ee0dc36be5f4ecadca035d7b375282ce97ebefa13c7f3189ff3b37f101" namespace=k8s.io protocol=ttrpc version=3
Dec 12 18:37:20.555007 systemd[1]: Started cri-containerd-bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972.scope - libcontainer container bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972.
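The shim handshake above (address=unix:///run/containerd/s/..., namespace=k8s.io, protocol=ttrpc) and the matching cri-containerd-*.scope units are ordinary containerd task setup; the resulting containers can be inspected from the host with the containerd Go client. A sketch, assuming the stock socket path /run/containerd/containerd.sock and the pre-2.x client module path:

    // list-k8s-containers.go: inspect the containers behind the
    // cri-containerd-* scopes above (sketch; socket path assumed).
    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	containerd "github.com/containerd/containerd"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()

    	// CRI-managed pods and containers live in the "k8s.io" namespace,
    	// matching the namespace=k8s.io field of the shim log entry above.
    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

    	containers, err := client.Containers(ctx)
    	if err != nil {
    		log.Fatal(err)
    	}
    	for _, c := range containers {
    		info, err := c.Info(ctx)
    		if err != nil {
    			continue
    		}
    		fmt.Printf("%s  image=%s\n", c.ID(), info.Image)
    	}
    }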
Dec 12 18:37:20.583754 containerd[1626]: time="2025-12-12T18:37:20.583549562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xnl9c,Uid:e8cef2ea-b857-4519-8fb6-e2a4196bd121,Namespace:calico-system,Attempt:0,} returns sandbox id \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\""
Dec 12 18:37:22.005964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3742531746.mount: Deactivated successfully.
Dec 12 18:37:22.099725 kubelet[2916]: E1212 18:37:22.098069 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98"
Dec 12 18:37:23.004045 containerd[1626]: time="2025-12-12T18:37:23.004008642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:37:23.005714 containerd[1626]: time="2025-12-12T18:37:23.005690819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Dec 12 18:37:23.007802 containerd[1626]: time="2025-12-12T18:37:23.007772661Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:37:23.008993 containerd[1626]: time="2025-12-12T18:37:23.008974302Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 18:37:23.009760 containerd[1626]: time="2025-12-12T18:37:23.009737962Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.533625521s"
Dec 12 18:37:23.009760 containerd[1626]: time="2025-12-12T18:37:23.009757989Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
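The pull above (about 35 MB in 2.53 s, resolved to the @sha256:6f4372... repo digest) can be reproduced outside the kubelet against the same daemon. A sketch with the containerd Go client, again assuming the default socket path:

    // pull-typha.go: pull the image from the log above (sketch).
    package main

    import (
    	"context"
    	"fmt"
    	"log"
    	"time"

    	containerd "github.com/containerd/containerd"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()
    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

    	start := time.Now()
    	// WithPullUnpack also unpacks the layers into the default
    	// snapshotter, which the CRI needs before a container can start.
    	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.30.4", containerd.WithPullUnpack)
    	if err != nil {
    		log.Fatal(err)
    	}
    	// img.Target().Digest corresponds to the repo digest in the
    	// "Pulled image" entry above.
    	fmt.Printf("pulled %s (%s) in %s\n", img.Name(), img.Target().Digest, time.Since(start))
    }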
for \"5d0eb2814ef2e323d489b2200a49add35ca22fc774269c637f4ae2535d2a7558\"" Dec 12 18:37:23.036735 containerd[1626]: time="2025-12-12T18:37:23.036713265Z" level=info msg="connecting to shim 5d0eb2814ef2e323d489b2200a49add35ca22fc774269c637f4ae2535d2a7558" address="unix:///run/containerd/s/ea630612c74db7d0f393e1e9658aaafbafc71eb4ed4d1150da250369a0340b21" protocol=ttrpc version=3 Dec 12 18:37:23.056947 systemd[1]: Started cri-containerd-5d0eb2814ef2e323d489b2200a49add35ca22fc774269c637f4ae2535d2a7558.scope - libcontainer container 5d0eb2814ef2e323d489b2200a49add35ca22fc774269c637f4ae2535d2a7558. Dec 12 18:37:23.097686 containerd[1626]: time="2025-12-12T18:37:23.097665955Z" level=info msg="StartContainer for \"5d0eb2814ef2e323d489b2200a49add35ca22fc774269c637f4ae2535d2a7558\" returns successfully" Dec 12 18:37:23.218700 kubelet[2916]: E1212 18:37:23.218672 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:23.218700 kubelet[2916]: W1212 18:37:23.218693 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:23.246457 kubelet[2916]: E1212 18:37:23.246428 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:23.246778 kubelet[2916]: E1212 18:37:23.246765 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:23.246830 kubelet[2916]: W1212 18:37:23.246777 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:23.246830 kubelet[2916]: E1212 18:37:23.246799 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:23.247700 kubelet[2916]: E1212 18:37:23.247688 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:23.247700 kubelet[2916]: W1212 18:37:23.247699 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:23.247765 kubelet[2916]: E1212 18:37:23.247714 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:23.259196 kubelet[2916]: E1212 18:37:23.259128 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:23.259196 kubelet[2916]: W1212 18:37:23.259143 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:23.259196 kubelet[2916]: E1212 18:37:23.259158 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Dec 12 18:37:23.218700 kubelet[2916]: E1212 18:37:23.218672 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 18:37:23.218700 kubelet[2916]: W1212 18:37:23.218693 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 18:37:23.246457 kubelet[2916]: E1212 18:37:23.246428 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... identical FlexVolume probe-failure triplet repeats through 18:37:23.266 ...]
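The probe flood recurs because the kubelet rescans its FlexVolume directory, set by the standard --volume-plugin-dir flag (here /opt/libexec/kubernetes/kubelet-plugins/volume/exec), whenever it changes. If no FlexVolume driver is actually needed on the node, one option is to point that flag at an empty directory through a systemd drop-in. A configuration sketch only, assuming a kubelet.service unit that expands $KUBELET_EXTRA_ARGS the way kubeadm-style units do:

    # /etc/systemd/system/kubelet.service.d/10-volume-plugin-dir.conf (sketch)
    [Service]
    # Re-point the FlexVolume prober at an empty directory so the missing
    # nodeagent~uds driver above stops failing on every directory rescan.
    Environment="KUBELET_EXTRA_ARGS=--volume-plugin-dir=/var/lib/kubelet/volumeplugins"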
Dec 12 18:37:24.102194 kubelet[2916]: E1212 18:37:24.101771 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98"
Dec 12 18:37:24.176673 kubelet[2916]: I1212 18:37:24.176630 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
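The "cni plugin not initialized" error for csi-node-driver-kv4lq persists until calico-node's install step writes a CNI network config into /etc/cni/net.d, at which point the runtime's NetworkReady condition flips and pod sync resumes. For reference, an illustrative Calico conflist of the shape that typically ends up there (values assumed, not dumped from this host):

    {
      "name": "k8s-pod-network",
      "cniVersion": "0.3.1",
      "plugins": [
        {
          "type": "calico",
          "log_level": "info",
          "datastore_type": "kubernetes",
          "ipam": { "type": "calico-ipam" },
          "policy": { "type": "k8s" },
          "kubernetes": { "kubeconfig": "/etc/cni/net.d/calico-kubeconfig" }
        },
        { "type": "portmap", "snat": true, "capabilities": { "portMappings": true } }
      ]
    }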
Dec 12 18:37:24.266493 kubelet[2916]: E1212 18:37:24.266471 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 18:37:24.266493 kubelet[2916]: W1212 18:37:24.266486 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 18:37:24.484331 kubelet[2916]: E1212 18:37:24.266499 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... identical FlexVolume probe-failure triplet repeats ...]
Dec 12 18:37:24.486931 kubelet[2916]: E1212 18:37:24.269434 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 18:37:24.486931 kubelet[2916]: W1212 18:37:24.269439 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 18:37:24.486931 kubelet[2916]: E1212 18:37:24.269446 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 12 18:37:24.486931 kubelet[2916]: E1212 18:37:24.269522 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:24.486931 kubelet[2916]: W1212 18:37:24.269526 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:24.486931 kubelet[2916]: E1212 18:37:24.269530 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:24.486931 kubelet[2916]: E1212 18:37:24.269650 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:24.486931 kubelet[2916]: W1212 18:37:24.269654 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:24.487125 kubelet[2916]: E1212 18:37:24.269663 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:24.487125 kubelet[2916]: E1212 18:37:24.269821 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:24.487125 kubelet[2916]: W1212 18:37:24.269826 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:24.487125 kubelet[2916]: E1212 18:37:24.269836 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 12 18:37:24.487125 kubelet[2916]: E1212 18:37:24.269923 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 12 18:37:24.487125 kubelet[2916]: W1212 18:37:24.269927 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 12 18:37:24.487125 kubelet[2916]: E1212 18:37:24.269932 2916 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 12 18:37:24.779325 containerd[1626]: time="2025-12-12T18:37:24.779282295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:24.780171 containerd[1626]: time="2025-12-12T18:37:24.780147711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 12 18:37:24.780291 containerd[1626]: time="2025-12-12T18:37:24.780272717Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:24.782066 containerd[1626]: time="2025-12-12T18:37:24.782041443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:24.782895 containerd[1626]: time="2025-12-12T18:37:24.782856579Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.768369701s" Dec 12 18:37:24.782895 containerd[1626]: time="2025-12-12T18:37:24.782879725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 12 18:37:24.784232 containerd[1626]: time="2025-12-12T18:37:24.784211750Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 12 18:37:24.792727 containerd[1626]: time="2025-12-12T18:37:24.792628968Z" level=info msg="Container da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:24.811642 containerd[1626]: time="2025-12-12T18:37:24.811567030Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f\"" Dec 12 18:37:24.813015 containerd[1626]: time="2025-12-12T18:37:24.812872827Z" level=info msg="StartContainer for \"da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f\"" Dec 12 18:37:24.813991 containerd[1626]: time="2025-12-12T18:37:24.813977996Z" level=info msg="connecting to shim da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f" address="unix:///run/containerd/s/40f4c0ee0dc36be5f4ecadca035d7b375282ce97ebefa13c7f3189ff3b37f101" protocol=ttrpc version=3 Dec 12 18:37:24.841985 systemd[1]: Started cri-containerd-da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f.scope - libcontainer container da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f. Dec 12 18:37:24.910993 systemd[1]: cri-containerd-da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f.scope: Deactivated successfully. 
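The repeated triplet above is mechanical: the kubelet's FlexVolume prober execs <plugin-dir>/nodeagent~uds/uds with the argument init, the binary is not installed yet (the pod2daemon-flexvol image that ships the driver only finished pulling in the entries just above), so the call produces no stdout, and unmarshalling an empty buffer fails with "unexpected end of JSON input". A minimal Go sketch of that failure path, using a simplified DriverStatus shape rather than the kubelet's actual struct:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus is a simplified stand-in for the JSON reply a FlexVolume
// driver is expected to print, e.g. {"status":"Success"}.
type DriverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func main() {
	// Driver path copied from the log above; on a node where the driver
	// has not been installed yet, the exec fails and out stays empty.
	out, execErr := exec.Command(
		"/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds",
		"init",
	).Output()

	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// With empty output this prints: unexpected end of JSON input,
		// the same error driver-call.go:262 reports on every probe.
		fmt.Printf("exec: %v, unmarshal: %v\n", execErr, err)
		return
	}
	fmt.Printf("driver status: %s\n", st.Status)
}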
Dec 12 18:37:24.914108 containerd[1626]: time="2025-12-12T18:37:24.914074509Z" level=info msg="StartContainer for \"da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f\" returns successfully" Dec 12 18:37:24.959978 containerd[1626]: time="2025-12-12T18:37:24.959882844Z" level=info msg="received container exit event container_id:\"da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f\" id:\"da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f\" pid:3613 exited_at:{seconds:1765564644 nanos:914597988}" Dec 12 18:37:24.975771 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-da23ff71c98a26fef5e60d86735ce554df89de45441cc78b85662b46f73fb50f-rootfs.mount: Deactivated successfully. Dec 12 18:37:25.215930 kubelet[2916]: I1212 18:37:25.215574 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6c75b45c8-kkpgm" podStartSLOduration=2.677210986 podStartE2EDuration="5.215555859s" podCreationTimestamp="2025-12-12 18:37:20 +0000 UTC" firstStartedPulling="2025-12-12 18:37:20.475774106 +0000 UTC m=+18.507136781" lastFinishedPulling="2025-12-12 18:37:23.014118978 +0000 UTC m=+21.045481654" observedRunningTime="2025-12-12 18:37:23.200411766 +0000 UTC m=+21.231774447" watchObservedRunningTime="2025-12-12 18:37:25.215555859 +0000 UTC m=+23.246918541" Dec 12 18:37:26.099805 kubelet[2916]: E1212 18:37:26.099397 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:26.186665 containerd[1626]: time="2025-12-12T18:37:26.186587332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 12 18:37:27.125272 kubelet[2916]: I1212 18:37:27.124954 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 12 18:37:28.118091 kubelet[2916]: E1212 18:37:28.118051 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:30.098582 kubelet[2916]: E1212 18:37:30.097707 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:30.946286 containerd[1626]: time="2025-12-12T18:37:30.946237723Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:30.946894 containerd[1626]: time="2025-12-12T18:37:30.946871627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 12 18:37:30.947527 containerd[1626]: time="2025-12-12T18:37:30.947452752Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:30.948816 containerd[1626]: time="2025-12-12T18:37:30.948567276Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:30.949231 containerd[1626]: time="2025-12-12T18:37:30.949087336Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.761094052s" Dec 12 18:37:30.949231 containerd[1626]: time="2025-12-12T18:37:30.949106250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 12 18:37:30.951281 containerd[1626]: time="2025-12-12T18:37:30.951253080Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 12 18:37:30.973985 containerd[1626]: time="2025-12-12T18:37:30.973954674Z" level=info msg="Container 0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:31.017142 containerd[1626]: time="2025-12-12T18:37:31.017109598Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a\"" Dec 12 18:37:31.017639 containerd[1626]: time="2025-12-12T18:37:31.017614449Z" level=info msg="StartContainer for \"0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a\"" Dec 12 18:37:31.018582 containerd[1626]: time="2025-12-12T18:37:31.018555598Z" level=info msg="connecting to shim 0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a" address="unix:///run/containerd/s/40f4c0ee0dc36be5f4ecadca035d7b375282ce97ebefa13c7f3189ff3b37f101" protocol=ttrpc version=3 Dec 12 18:37:31.040924 systemd[1]: Started cri-containerd-0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a.scope - libcontainer container 0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a. Dec 12 18:37:31.102083 containerd[1626]: time="2025-12-12T18:37:31.102007109Z" level=info msg="StartContainer for \"0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a\" returns successfully" Dec 12 18:37:32.098529 kubelet[2916]: E1212 18:37:32.097790 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:32.459546 systemd[1]: cri-containerd-0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a.scope: Deactivated successfully. Dec 12 18:37:32.459924 systemd[1]: cri-containerd-0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a.scope: Consumed 355ms CPU time, 159M memory peak, 1.1M read from disk, 171.3M written to disk. 
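The pod_startup_latency_tracker entry for calico-typha a few entries back encodes a simple relationship: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check in Go with the logged wall-clock timestamps (a sketch, not the kubelet's code; the 1 ns difference from the logged 2.677210986s comes from the monotonic m=+ readings the kubelet subtracts internally):

package main

import (
	"fmt"
	"time"
)

// layout matches Go's default time.Time formatting used in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func ts(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := ts("2025-12-12 18:37:20 +0000 UTC")
	firstPull := ts("2025-12-12 18:37:20.475774106 +0000 UTC")
	lastPull := ts("2025-12-12 18:37:23.014118978 +0000 UTC")
	running := ts("2025-12-12 18:37:25.215555859 +0000 UTC")

	e2e := running.Sub(created)          // 5.215555859s, as logged
	slo := e2e - lastPull.Sub(firstPull) // pull window excluded: 2.677210987s
	fmt.Println(e2e, slo)
}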
Dec 12 18:37:32.487482 containerd[1626]: time="2025-12-12T18:37:32.487449190Z" level=info msg="received container exit event container_id:\"0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a\" id:\"0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a\" pid:3674 exited_at:{seconds:1765564652 nanos:486903682}" Dec 12 18:37:32.545734 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0d0bcc2c4e18760e28806cea1d55653162fad397edfc04885ceb4754d7e5eb6a-rootfs.mount: Deactivated successfully. Dec 12 18:37:32.554012 kubelet[2916]: I1212 18:37:32.553983 2916 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 12 18:37:32.619276 kubelet[2916]: I1212 18:37:32.618295 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/29190c43-36b5-4106-a010-9c89c155f184-calico-apiserver-certs\") pod \"calico-apiserver-654cf49445-kj5np\" (UID: \"29190c43-36b5-4106-a010-9c89c155f184\") " pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" Dec 12 18:37:32.619276 kubelet[2916]: I1212 18:37:32.618323 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7abc4147-fe9a-4147-b541-36ac776f31d2-calico-apiserver-certs\") pod \"calico-apiserver-654cf49445-47vm8\" (UID: \"7abc4147-fe9a-4147-b541-36ac776f31d2\") " pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" Dec 12 18:37:32.619276 kubelet[2916]: I1212 18:37:32.618340 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pkv\" (UniqueName: \"kubernetes.io/projected/8cb31e56-69d1-46a2-81fe-73abfd4d083c-kube-api-access-25pkv\") pod \"coredns-668d6bf9bc-gxxjb\" (UID: \"8cb31e56-69d1-46a2-81fe-73abfd4d083c\") " pod="kube-system/coredns-668d6bf9bc-gxxjb" Dec 12 18:37:32.619276 kubelet[2916]: I1212 18:37:32.618355 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dh2s\" (UniqueName: \"kubernetes.io/projected/add8767b-e2e5-42f4-972c-f607ea5079b5-kube-api-access-6dh2s\") pod \"goldmane-666569f655-pczf2\" (UID: \"add8767b-e2e5-42f4-972c-f607ea5079b5\") " pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:32.619276 kubelet[2916]: I1212 18:37:32.618367 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfws\" (UniqueName: \"kubernetes.io/projected/dbd4b2e9-2587-4318-a8ac-1330292f926a-kube-api-access-9qfws\") pod \"whisker-59b5dc5f57-qbd4b\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " pod="calico-system/whisker-59b5dc5f57-qbd4b" Dec 12 18:37:32.619459 kubelet[2916]: I1212 18:37:32.618379 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71f4ef80-18f5-4487-91bd-c6eaf8bbba03-config-volume\") pod \"coredns-668d6bf9bc-n4q9k\" (UID: \"71f4ef80-18f5-4487-91bd-c6eaf8bbba03\") " pod="kube-system/coredns-668d6bf9bc-n4q9k" Dec 12 18:37:32.619459 kubelet[2916]: I1212 18:37:32.618388 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckrhz\" (UniqueName: \"kubernetes.io/projected/71f4ef80-18f5-4487-91bd-c6eaf8bbba03-kube-api-access-ckrhz\") pod \"coredns-668d6bf9bc-n4q9k\" (UID: \"71f4ef80-18f5-4487-91bd-c6eaf8bbba03\") " 
pod="kube-system/coredns-668d6bf9bc-n4q9k" Dec 12 18:37:32.619459 kubelet[2916]: I1212 18:37:32.618399 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-ca-bundle\") pod \"whisker-59b5dc5f57-qbd4b\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " pod="calico-system/whisker-59b5dc5f57-qbd4b" Dec 12 18:37:32.619459 kubelet[2916]: I1212 18:37:32.618409 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdl6g\" (UniqueName: \"kubernetes.io/projected/8796fb6d-23f3-4add-b202-c640f2b3e6d5-kube-api-access-rdl6g\") pod \"calico-kube-controllers-6989cc9fb7-8wtsx\" (UID: \"8796fb6d-23f3-4add-b202-c640f2b3e6d5\") " pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" Dec 12 18:37:32.619459 kubelet[2916]: I1212 18:37:32.618420 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrks7\" (UniqueName: \"kubernetes.io/projected/29190c43-36b5-4106-a010-9c89c155f184-kube-api-access-vrks7\") pod \"calico-apiserver-654cf49445-kj5np\" (UID: \"29190c43-36b5-4106-a010-9c89c155f184\") " pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" Dec 12 18:37:32.619565 kubelet[2916]: I1212 18:37:32.618433 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/add8767b-e2e5-42f4-972c-f607ea5079b5-goldmane-key-pair\") pod \"goldmane-666569f655-pczf2\" (UID: \"add8767b-e2e5-42f4-972c-f607ea5079b5\") " pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:32.619565 kubelet[2916]: I1212 18:37:32.618444 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/add8767b-e2e5-42f4-972c-f607ea5079b5-goldmane-ca-bundle\") pod \"goldmane-666569f655-pczf2\" (UID: \"add8767b-e2e5-42f4-972c-f607ea5079b5\") " pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:32.619565 kubelet[2916]: I1212 18:37:32.618456 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add8767b-e2e5-42f4-972c-f607ea5079b5-config\") pod \"goldmane-666569f655-pczf2\" (UID: \"add8767b-e2e5-42f4-972c-f607ea5079b5\") " pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:32.619565 kubelet[2916]: I1212 18:37:32.618493 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-backend-key-pair\") pod \"whisker-59b5dc5f57-qbd4b\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " pod="calico-system/whisker-59b5dc5f57-qbd4b" Dec 12 18:37:32.619565 kubelet[2916]: I1212 18:37:32.618519 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb31e56-69d1-46a2-81fe-73abfd4d083c-config-volume\") pod \"coredns-668d6bf9bc-gxxjb\" (UID: \"8cb31e56-69d1-46a2-81fe-73abfd4d083c\") " pod="kube-system/coredns-668d6bf9bc-gxxjb" Dec 12 18:37:32.619694 kubelet[2916]: I1212 18:37:32.618534 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8796fb6d-23f3-4add-b202-c640f2b3e6d5-tigera-ca-bundle\") pod \"calico-kube-controllers-6989cc9fb7-8wtsx\" (UID: \"8796fb6d-23f3-4add-b202-c640f2b3e6d5\") " pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" Dec 12 18:37:32.619694 kubelet[2916]: I1212 18:37:32.618543 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlz27\" (UniqueName: \"kubernetes.io/projected/7abc4147-fe9a-4147-b541-36ac776f31d2-kube-api-access-hlz27\") pod \"calico-apiserver-654cf49445-47vm8\" (UID: \"7abc4147-fe9a-4147-b541-36ac776f31d2\") " pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" Dec 12 18:37:32.622106 systemd[1]: Created slice kubepods-burstable-pod71f4ef80_18f5_4487_91bd_c6eaf8bbba03.slice - libcontainer container kubepods-burstable-pod71f4ef80_18f5_4487_91bd_c6eaf8bbba03.slice. Dec 12 18:37:32.628445 systemd[1]: Created slice kubepods-besteffort-pod8796fb6d_23f3_4add_b202_c640f2b3e6d5.slice - libcontainer container kubepods-besteffort-pod8796fb6d_23f3_4add_b202_c640f2b3e6d5.slice. Dec 12 18:37:32.637218 systemd[1]: Created slice kubepods-besteffort-pod7abc4147_fe9a_4147_b541_36ac776f31d2.slice - libcontainer container kubepods-besteffort-pod7abc4147_fe9a_4147_b541_36ac776f31d2.slice. Dec 12 18:37:32.643099 systemd[1]: Created slice kubepods-besteffort-pod29190c43_36b5_4106_a010_9c89c155f184.slice - libcontainer container kubepods-besteffort-pod29190c43_36b5_4106_a010_9c89c155f184.slice. Dec 12 18:37:32.649680 systemd[1]: Created slice kubepods-besteffort-podadd8767b_e2e5_42f4_972c_f607ea5079b5.slice - libcontainer container kubepods-besteffort-podadd8767b_e2e5_42f4_972c_f607ea5079b5.slice. Dec 12 18:37:32.656812 systemd[1]: Created slice kubepods-burstable-pod8cb31e56_69d1_46a2_81fe_73abfd4d083c.slice - libcontainer container kubepods-burstable-pod8cb31e56_69d1_46a2_81fe_73abfd4d083c.slice. Dec 12 18:37:32.661426 systemd[1]: Created slice kubepods-besteffort-poddbd4b2e9_2587_4318_a8ac_1330292f926a.slice - libcontainer container kubepods-besteffort-poddbd4b2e9_2587_4318_a8ac_1330292f926a.slice. 
Dec 12 18:37:32.932162 containerd[1626]: time="2025-12-12T18:37:32.932137115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n4q9k,Uid:71f4ef80-18f5-4487-91bd-c6eaf8bbba03,Namespace:kube-system,Attempt:0,}" Dec 12 18:37:32.934149 containerd[1626]: time="2025-12-12T18:37:32.933998362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6989cc9fb7-8wtsx,Uid:8796fb6d-23f3-4add-b202-c640f2b3e6d5,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:32.950393 containerd[1626]: time="2025-12-12T18:37:32.950217720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-kj5np,Uid:29190c43-36b5-4106-a010-9c89c155f184,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:37:32.950600 containerd[1626]: time="2025-12-12T18:37:32.950587932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-47vm8,Uid:7abc4147-fe9a-4147-b541-36ac776f31d2,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:37:32.970776 containerd[1626]: time="2025-12-12T18:37:32.970378596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b5dc5f57-qbd4b,Uid:dbd4b2e9-2587-4318-a8ac-1330292f926a,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:32.976193 containerd[1626]: time="2025-12-12T18:37:32.976029190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pczf2,Uid:add8767b-e2e5-42f4-972c-f607ea5079b5,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:32.976193 containerd[1626]: time="2025-12-12T18:37:32.976185023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gxxjb,Uid:8cb31e56-69d1-46a2-81fe-73abfd4d083c,Namespace:kube-system,Attempt:0,}" Dec 12 18:37:33.293910 containerd[1626]: time="2025-12-12T18:37:33.291640016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 18:37:33.369071 containerd[1626]: time="2025-12-12T18:37:33.368980943Z" level=error msg="Failed to destroy network for sandbox \"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.373666 containerd[1626]: time="2025-12-12T18:37:33.373636013Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pczf2,Uid:add8767b-e2e5-42f4-972c-f607ea5079b5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.374142 kubelet[2916]: E1212 18:37:33.374093 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.374761 kubelet[2916]: E1212 18:37:33.374452 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:33.374761 kubelet[2916]: E1212 18:37:33.374470 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-pczf2" Dec 12 18:37:33.374761 kubelet[2916]: E1212 18:37:33.374513 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-pczf2_calico-system(add8767b-e2e5-42f4-972c-f607ea5079b5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-pczf2_calico-system(add8767b-e2e5-42f4-972c-f607ea5079b5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2498e737b5aa641c37e9ea7b5b721454d1080f238498db61beffe0f7b69e8cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:37:33.378735 containerd[1626]: time="2025-12-12T18:37:33.378651184Z" level=error msg="Failed to destroy network for sandbox \"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.383974 containerd[1626]: time="2025-12-12T18:37:33.383876724Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n4q9k,Uid:71f4ef80-18f5-4487-91bd-c6eaf8bbba03,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.401591 containerd[1626]: time="2025-12-12T18:37:33.401407317Z" level=error msg="Failed to destroy network for sandbox \"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.402237 containerd[1626]: time="2025-12-12T18:37:33.402063460Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-47vm8,Uid:7abc4147-fe9a-4147-b541-36ac776f31d2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.402237 containerd[1626]: time="2025-12-12T18:37:33.402169802Z" 
level=error msg="Failed to destroy network for sandbox \"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403113 containerd[1626]: time="2025-12-12T18:37:33.402265506Z" level=error msg="Failed to destroy network for sandbox \"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403113 containerd[1626]: time="2025-12-12T18:37:33.403003762Z" level=error msg="Failed to destroy network for sandbox \"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403157 kubelet[2916]: E1212 18:37:33.402799 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403157 kubelet[2916]: E1212 18:37:33.402832 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403157 kubelet[2916]: E1212 18:37:33.402849 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-n4q9k" Dec 12 18:37:33.403157 kubelet[2916]: E1212 18:37:33.402863 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" Dec 12 18:37:33.403275 kubelet[2916]: E1212 18:37:33.402867 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-n4q9k" Dec 12 18:37:33.403275 kubelet[2916]: E1212 18:37:33.402874 2916 
kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" Dec 12 18:37:33.403275 kubelet[2916]: E1212 18:37:33.402897 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-n4q9k_kube-system(71f4ef80-18f5-4487-91bd-c6eaf8bbba03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-n4q9k_kube-system(71f4ef80-18f5-4487-91bd-c6eaf8bbba03)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80b0ea7f32981ce9482254df143c36c0aa097b777432b51d7b76980a16927b05\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-n4q9k" podUID="71f4ef80-18f5-4487-91bd-c6eaf8bbba03" Dec 12 18:37:33.403362 kubelet[2916]: E1212 18:37:33.402909 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fff2bdd89d44104a0ccca07e6db0e3467bd6fc29ed56743b8774fb2c5ade91a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:37:33.403706 containerd[1626]: time="2025-12-12T18:37:33.403427239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b5dc5f57-qbd4b,Uid:dbd4b2e9-2587-4318-a8ac-1330292f926a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.403706 containerd[1626]: time="2025-12-12T18:37:33.403505116Z" level=error msg="Failed to destroy network for sandbox \"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404037 containerd[1626]: time="2025-12-12T18:37:33.403911002Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gxxjb,Uid:8cb31e56-69d1-46a2-81fe-73abfd4d083c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404351 kubelet[2916]: E1212 18:37:33.404197 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404351 kubelet[2916]: E1212 18:37:33.404226 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gxxjb" Dec 12 18:37:33.404351 kubelet[2916]: E1212 18:37:33.404229 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404351 kubelet[2916]: E1212 18:37:33.404239 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gxxjb" Dec 12 18:37:33.404487 kubelet[2916]: E1212 18:37:33.404263 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59b5dc5f57-qbd4b" Dec 12 18:37:33.404487 kubelet[2916]: E1212 18:37:33.404307 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59b5dc5f57-qbd4b" Dec 12 18:37:33.404487 kubelet[2916]: E1212 18:37:33.404319 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gxxjb_kube-system(8cb31e56-69d1-46a2-81fe-73abfd4d083c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gxxjb_kube-system(8cb31e56-69d1-46a2-81fe-73abfd4d083c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"34cdc874970759a0d528995f63c7038834e2a2bc943360006c8c75cdbedc5c49\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gxxjb" podUID="8cb31e56-69d1-46a2-81fe-73abfd4d083c" Dec 12 18:37:33.404581 kubelet[2916]: E1212 18:37:33.404333 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-59b5dc5f57-qbd4b_calico-system(dbd4b2e9-2587-4318-a8ac-1330292f926a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-59b5dc5f57-qbd4b_calico-system(dbd4b2e9-2587-4318-a8ac-1330292f926a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca349921caf506cdbd086f006294e1a0f57b3d1143954df662a3e0e79eeec0dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-59b5dc5f57-qbd4b" podUID="dbd4b2e9-2587-4318-a8ac-1330292f926a" Dec 12 18:37:33.404714 containerd[1626]: time="2025-12-12T18:37:33.404662408Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-kj5np,Uid:29190c43-36b5-4106-a010-9c89c155f184,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404926 kubelet[2916]: E1212 18:37:33.404908 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.404977 kubelet[2916]: E1212 18:37:33.404930 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" Dec 12 18:37:33.404977 kubelet[2916]: E1212 18:37:33.404947 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" Dec 12 18:37:33.405041 kubelet[2916]: E1212 18:37:33.404972 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-654cf49445-kj5np_calico-apiserver(29190c43-36b5-4106-a010-9c89c155f184)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-654cf49445-kj5np_calico-apiserver(29190c43-36b5-4106-a010-9c89c155f184)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"4f3b10a3057c0db9658e0c44dc6b033e195e48c91a921d498bfe9443226601d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:37:33.405102 containerd[1626]: time="2025-12-12T18:37:33.405083272Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6989cc9fb7-8wtsx,Uid:8796fb6d-23f3-4add-b202-c640f2b3e6d5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.405461 kubelet[2916]: E1212 18:37:33.405184 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:33.405564 kubelet[2916]: E1212 18:37:33.405211 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" Dec 12 18:37:33.405564 kubelet[2916]: E1212 18:37:33.405512 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" Dec 12 18:37:33.405564 kubelet[2916]: E1212 18:37:33.405539 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6989cc9fb7-8wtsx_calico-system(8796fb6d-23f3-4add-b202-c640f2b3e6d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6989cc9fb7-8wtsx_calico-system(8796fb6d-23f3-4add-b202-c640f2b3e6d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df700dbe8b6ecfd037a81d3f8357287a7b685e5d766cd8baca0dead8e7e649b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:37:34.102442 systemd[1]: Created slice kubepods-besteffort-pod8031345b_ae64_4c92_b720_42f91263ef98.slice - libcontainer container kubepods-besteffort-pod8031345b_ae64_4c92_b720_42f91263ef98.slice. 
Dec 12 18:37:34.104400 containerd[1626]: time="2025-12-12T18:37:34.104141157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kv4lq,Uid:8031345b-ae64-4c92-b720-42f91263ef98,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:34.147234 containerd[1626]: time="2025-12-12T18:37:34.147206485Z" level=error msg="Failed to destroy network for sandbox \"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:34.149241 containerd[1626]: time="2025-12-12T18:37:34.149203195Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kv4lq,Uid:8031345b-ae64-4c92-b720-42f91263ef98,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:34.149582 systemd[1]: run-netns-cni\x2dba8f84be\x2d00e4\x2d7557\x2df1b5\x2dbda16da648a2.mount: Deactivated successfully. Dec 12 18:37:34.149953 kubelet[2916]: E1212 18:37:34.149567 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 18:37:34.149953 kubelet[2916]: E1212 18:37:34.149625 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:34.149953 kubelet[2916]: E1212 18:37:34.149652 2916 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kv4lq" Dec 12 18:37:34.150098 kubelet[2916]: E1212 18:37:34.149720 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"71786570ef1bc9f4e3cd9b0735356200f0e9a741f5382f910f525788fafac435\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" 
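Every sandbox failure above bottoms out in the same stat: the calico CNI plugin reads /var/lib/calico/nodename, a file that calico/node writes once it is running, and until the node image finishes pulling (next entries) the file is absent and every CNI add or delete fails. A minimal sketch of the check, on the assumption that it amounts to a plain file read:

package main

import (
	"fmt"
	"os"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	name, err := os.ReadFile(nodenameFile)
	if err != nil {
		// On a node where calico-node has not started, this is the stat
		// failure the plugin wraps into its "check that the calico/node
		// container is running and has mounted /var/lib/calico/" message.
		fmt.Printf("stat %s: %v\n", nodenameFile, err)
		return
	}
	fmt.Printf("node name: %s\n", name)
}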
Dec 12 18:37:37.819932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3368584677.mount: Deactivated successfully. Dec 12 18:37:38.265316 containerd[1626]: time="2025-12-12T18:37:38.252432365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:38.317607 containerd[1626]: time="2025-12-12T18:37:38.317576059Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 12 18:37:38.333566 containerd[1626]: time="2025-12-12T18:37:38.333535997Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:38.370037 containerd[1626]: time="2025-12-12T18:37:38.370008340Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 18:37:38.371543 containerd[1626]: time="2025-12-12T18:37:38.371523598Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 5.077371573s" Dec 12 18:37:38.371543 containerd[1626]: time="2025-12-12T18:37:38.371542372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 12 18:37:38.471357 containerd[1626]: time="2025-12-12T18:37:38.471313605Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 18:37:38.586491 containerd[1626]: time="2025-12-12T18:37:38.584797843Z" level=info msg="Container 42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:38.586008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2621943432.mount: Deactivated successfully. Dec 12 18:37:38.791648 containerd[1626]: time="2025-12-12T18:37:38.791623556Z" level=info msg="CreateContainer within sandbox \"bd249b0382926e031d76070e8d838938fdf6b7d703c13a153f812ebc1fe92972\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51\"" Dec 12 18:37:38.792969 containerd[1626]: time="2025-12-12T18:37:38.792111158Z" level=info msg="StartContainer for \"42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51\"" Dec 12 18:37:38.805725 containerd[1626]: time="2025-12-12T18:37:38.805692374Z" level=info msg="connecting to shim 42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51" address="unix:///run/containerd/s/40f4c0ee0dc36be5f4ecadca035d7b375282ce97ebefa13c7f3189ff3b37f101" protocol=ttrpc version=3 Dec 12 18:37:38.943437 systemd[1]: Started cri-containerd-42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51.scope - libcontainer container 42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51. 
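[Editor's note] The ~157 MB pull of ghcr.io/flatcar/calico/node:v3.30.4 above completes in 5.077371573s, after which containerd creates the container and connects to its per-container shim over ttrpc (the "connecting to shim" line). A minimal sketch of driving the same pull through containerd's Go client, assuming the default socket path and the k8s.io namespace the CRI uses:

    package main

    import (
        "context"
        "fmt"
        "time"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Default containerd socket; CRI-managed images and containers
        // live in the "k8s.io" namespace.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            panic(err)
        }
        defer client.Close()

        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
        start := time.Now()
        img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.4",
            containerd.WithPullUnpack) // fetch and unpack, as the CRI path does
        if err != nil {
            panic(err)
        }
        fmt.Printf("pulled %s in %s\n", img.Name(), time.Since(start))
    }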
Dec 12 18:37:39.048617 containerd[1626]: time="2025-12-12T18:37:39.048579253Z" level=info msg="StartContainer for \"42937334782139684b8c3e5add7b663ffc4a8cbb27d2ddacc17e2cce54c8bb51\" returns successfully" Dec 12 18:37:39.948043 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 18:37:39.948633 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 12 18:37:40.260480 kubelet[2916]: I1212 18:37:40.260435 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xnl9c" podStartSLOduration=2.464960446 podStartE2EDuration="20.260422372s" podCreationTimestamp="2025-12-12 18:37:20 +0000 UTC" firstStartedPulling="2025-12-12 18:37:20.584377135 +0000 UTC m=+18.615739807" lastFinishedPulling="2025-12-12 18:37:38.379839058 +0000 UTC m=+36.411201733" observedRunningTime="2025-12-12 18:37:39.504225019 +0000 UTC m=+37.535587702" watchObservedRunningTime="2025-12-12 18:37:40.260422372 +0000 UTC m=+38.291785049" Dec 12 18:37:40.465455 kubelet[2916]: I1212 18:37:40.465410 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-backend-key-pair\") pod \"dbd4b2e9-2587-4318-a8ac-1330292f926a\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " Dec 12 18:37:40.465455 kubelet[2916]: I1212 18:37:40.465449 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qfws\" (UniqueName: \"kubernetes.io/projected/dbd4b2e9-2587-4318-a8ac-1330292f926a-kube-api-access-9qfws\") pod \"dbd4b2e9-2587-4318-a8ac-1330292f926a\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " Dec 12 18:37:40.465455 kubelet[2916]: I1212 18:37:40.465473 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-ca-bundle\") pod \"dbd4b2e9-2587-4318-a8ac-1330292f926a\" (UID: \"dbd4b2e9-2587-4318-a8ac-1330292f926a\") " Dec 12 18:37:40.465963 kubelet[2916]: I1212 18:37:40.465911 2916 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "dbd4b2e9-2587-4318-a8ac-1330292f926a" (UID: "dbd4b2e9-2587-4318-a8ac-1330292f926a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 18:37:40.476417 systemd[1]: var-lib-kubelet-pods-dbd4b2e9\x2d2587\x2d4318\x2da8ac\x2d1330292f926a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9qfws.mount: Deactivated successfully. Dec 12 18:37:40.479957 kubelet[2916]: I1212 18:37:40.477662 2916 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "dbd4b2e9-2587-4318-a8ac-1330292f926a" (UID: "dbd4b2e9-2587-4318-a8ac-1330292f926a"). InnerVolumeSpecName "whisker-backend-key-pair".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 18:37:40.479957 kubelet[2916]: I1212 18:37:40.477979 2916 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd4b2e9-2587-4318-a8ac-1330292f926a-kube-api-access-9qfws" (OuterVolumeSpecName: "kube-api-access-9qfws") pod "dbd4b2e9-2587-4318-a8ac-1330292f926a" (UID: "dbd4b2e9-2587-4318-a8ac-1330292f926a"). InnerVolumeSpecName "kube-api-access-9qfws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 18:37:40.480625 systemd[1]: var-lib-kubelet-pods-dbd4b2e9\x2d2587\x2d4318\x2da8ac\x2d1330292f926a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 18:37:40.566430 kubelet[2916]: I1212 18:37:40.565979 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Dec 12 18:37:40.566430 kubelet[2916]: I1212 18:37:40.566004 2916 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qfws\" (UniqueName: \"kubernetes.io/projected/dbd4b2e9-2587-4318-a8ac-1330292f926a-kube-api-access-9qfws\") on node \"localhost\" DevicePath \"\"" Dec 12 18:37:40.566430 kubelet[2916]: I1212 18:37:40.566014 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbd4b2e9-2587-4318-a8ac-1330292f926a-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Dec 12 18:37:40.640354 systemd[1]: Removed slice kubepods-besteffort-poddbd4b2e9_2587_4318_a8ac_1330292f926a.slice - libcontainer container kubepods-besteffort-poddbd4b2e9_2587_4318_a8ac_1330292f926a.slice. Dec 12 18:37:40.737799 systemd[1]: Created slice kubepods-besteffort-pod808b905f_e397_4649_b7f1_bb0b9a44aaa4.slice - libcontainer container kubepods-besteffort-pod808b905f_e397_4649_b7f1_bb0b9a44aaa4.slice. 
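[Editor's note] The pod_startup_latency_tracker record above nets image-pull time out of the end-to-end startup figure, and the numbers reconcile exactly from the monotonic (m=+...) offsets it logs:

    pull time = lastFinishedPulling - firstStartedPulling
              = 36.411201733 s - 18.615739807 s  = 17.795461926 s
    E2E       = watchObservedRunningTime - podCreationTimestamp
              = 18:37:40.260422372 - 18:37:20    = 20.260422372 s
    SLO       = E2E - pull time
              = 20.260422372 s - 17.795461926 s  = 2.464960446 s

which is precisely the podStartSLOduration kubelet reports.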
Dec 12 18:37:40.768109 kubelet[2916]: I1212 18:37:40.768032 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/808b905f-e397-4649-b7f1-bb0b9a44aaa4-whisker-ca-bundle\") pod \"whisker-d84cf57d5-mhld9\" (UID: \"808b905f-e397-4649-b7f1-bb0b9a44aaa4\") " pod="calico-system/whisker-d84cf57d5-mhld9" Dec 12 18:37:40.768109 kubelet[2916]: I1212 18:37:40.768063 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/808b905f-e397-4649-b7f1-bb0b9a44aaa4-whisker-backend-key-pair\") pod \"whisker-d84cf57d5-mhld9\" (UID: \"808b905f-e397-4649-b7f1-bb0b9a44aaa4\") " pod="calico-system/whisker-d84cf57d5-mhld9" Dec 12 18:37:40.768109 kubelet[2916]: I1212 18:37:40.768075 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47hz\" (UniqueName: \"kubernetes.io/projected/808b905f-e397-4649-b7f1-bb0b9a44aaa4-kube-api-access-f47hz\") pod \"whisker-d84cf57d5-mhld9\" (UID: \"808b905f-e397-4649-b7f1-bb0b9a44aaa4\") " pod="calico-system/whisker-d84cf57d5-mhld9" Dec 12 18:37:41.041447 containerd[1626]: time="2025-12-12T18:37:41.041217848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d84cf57d5-mhld9,Uid:808b905f-e397-4649-b7f1-bb0b9a44aaa4,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:41.591467 systemd-networkd[1303]: calia7c935f66a5: Link UP Dec 12 18:37:41.591608 systemd-networkd[1303]: calia7c935f66a5: Gained carrier Dec 12 18:37:41.604979 containerd[1626]: 2025-12-12 18:37:41.073 [INFO][4053] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 18:37:41.604979 containerd[1626]: 2025-12-12 18:37:41.119 [INFO][4053] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--d84cf57d5--mhld9-eth0 whisker-d84cf57d5- calico-system 808b905f-e397-4649-b7f1-bb0b9a44aaa4 873 0 2025-12-12 18:37:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:d84cf57d5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-d84cf57d5-mhld9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia7c935f66a5 [] [] }} ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-" Dec 12 18:37:41.604979 containerd[1626]: 2025-12-12 18:37:41.119 [INFO][4053] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.604979 containerd[1626]: 2025-12-12 18:37:41.532 [INFO][4065] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" HandleID="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Workload="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.535 [INFO][4065] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" 
HandleID="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Workload="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e830), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-d84cf57d5-mhld9", "timestamp":"2025-12-12 18:37:41.532802893 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.535 [INFO][4065] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.535 [INFO][4065] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.536 [INFO][4065] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.548 [INFO][4065] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" host="localhost" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.563 [INFO][4065] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.565 [INFO][4065] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.566 [INFO][4065] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.568 [INFO][4065] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:41.611120 containerd[1626]: 2025-12-12 18:37:41.569 [INFO][4065] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" host="localhost" Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.569 [INFO][4065] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9 Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.572 [INFO][4065] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" host="localhost" Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.576 [INFO][4065] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" host="localhost" Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.576 [INFO][4065] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" host="localhost" Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.576 [INFO][4065] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
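[Editor's note] The IPAM walk above is Calico's per-host block allocation: the node holds an affinity for the block 192.168.88.128/26 (6 host bits, so 64 addresses, .128 through .191), takes the host-wide lock, and claims the first free address, 192.168.88.129; the sandboxes created later in this log draw .130 and .131 from the same block. A quick, illustrative containment check with Go's net/netip:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // The per-host block the IPAM log shows an affinity for.
        block := netip.MustParsePrefix("192.168.88.128/26")
        for _, s := range []string{"192.168.88.129", "192.168.88.130", "192.168.88.131"} {
            addr := netip.MustParseAddr(s)
            fmt.Printf("%s in %s: %v\n", addr, block, block.Contains(addr))
        }
    }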
Dec 12 18:37:41.615588 containerd[1626]: 2025-12-12 18:37:41.577 [INFO][4065] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" HandleID="k8s-pod-network.0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Workload="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.622751 containerd[1626]: 2025-12-12 18:37:41.579 [INFO][4053] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--d84cf57d5--mhld9-eth0", GenerateName:"whisker-d84cf57d5-", Namespace:"calico-system", SelfLink:"", UID:"808b905f-e397-4649-b7f1-bb0b9a44aaa4", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d84cf57d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-d84cf57d5-mhld9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia7c935f66a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:41.622751 containerd[1626]: 2025-12-12 18:37:41.579 [INFO][4053] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.627769 containerd[1626]: 2025-12-12 18:37:41.579 [INFO][4053] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7c935f66a5 ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.627769 containerd[1626]: 2025-12-12 18:37:41.590 [INFO][4053] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.627824 containerd[1626]: 2025-12-12 18:37:41.590 [INFO][4053] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--d84cf57d5--mhld9-eth0", GenerateName:"whisker-d84cf57d5-", Namespace:"calico-system", SelfLink:"", UID:"808b905f-e397-4649-b7f1-bb0b9a44aaa4", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"d84cf57d5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9", Pod:"whisker-d84cf57d5-mhld9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia7c935f66a5", MAC:"fe:3b:a5:63:04:33", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:41.627868 containerd[1626]: 2025-12-12 18:37:41.602 [INFO][4053] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" Namespace="calico-system" Pod="whisker-d84cf57d5-mhld9" WorkloadEndpoint="localhost-k8s-whisker--d84cf57d5--mhld9-eth0" Dec 12 18:37:41.751368 containerd[1626]: time="2025-12-12T18:37:41.751334343Z" level=info msg="connecting to shim 0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9" address="unix:///run/containerd/s/7f3d743ab4a1150ed4ab7cf6d2459977d3aaa53df21124b8ec3d6d1b57a1b58a" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:41.785582 systemd-networkd[1303]: vxlan.calico: Link UP Dec 12 18:37:41.785827 systemd-networkd[1303]: vxlan.calico: Gained carrier Dec 12 18:37:41.786199 systemd[1]: Started cri-containerd-0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9.scope - libcontainer container 0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9. 
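[Editor's note] The second endpoint dump above fills in the veth's MAC, fe:3b:a5:63:04:33, and networkd later reports calia7c935f66a5 "Gained IPv6LL". Assuming networkd's default EUI-64 link-local derivation (flip the universal/local bit of the first octet, splice ff:fe into the middle), the acquired address would be fe80::fc3b:a5ff:fe63:433; a tiny sketch of that derivation:

    package main

    import "fmt"

    func main() {
        // EUI-64 derivation for the veth MAC in the endpoint dump above.
        mac := [6]byte{0xfe, 0x3b, 0xa5, 0x63, 0x04, 0x33}
        mac[0] ^= 0x02 // flip the universal/local bit: fe -> fc
        fmt.Printf("fe80::%02x%02x:%02xff:fe%02x:%02x%02x\n",
            mac[0], mac[1], mac[2], mac[3], mac[4], mac[5])
        // prints fe80::fc3b:a5ff:fe63:0433 (canonically fe80::fc3b:a5ff:fe63:433)
    }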
Dec 12 18:37:41.808599 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:41.917605 containerd[1626]: time="2025-12-12T18:37:41.917164206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d84cf57d5-mhld9,Uid:808b905f-e397-4649-b7f1-bb0b9a44aaa4,Namespace:calico-system,Attempt:0,} returns sandbox id \"0ad64fb511e18273ea4372fed90e21b352897b0891140ce72bd5b623ce4e0df9\"" Dec 12 18:37:41.936390 containerd[1626]: time="2025-12-12T18:37:41.936182490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:37:42.099768 kubelet[2916]: I1212 18:37:42.099654 2916 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd4b2e9-2587-4318-a8ac-1330292f926a" path="/var/lib/kubelet/pods/dbd4b2e9-2587-4318-a8ac-1330292f926a/volumes" Dec 12 18:37:42.400236 containerd[1626]: time="2025-12-12T18:37:42.400201497Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:42.404887 containerd[1626]: time="2025-12-12T18:37:42.402007858Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:37:42.404887 containerd[1626]: time="2025-12-12T18:37:42.402060053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:37:42.406393 kubelet[2916]: E1212 18:37:42.402188 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:37:42.406393 kubelet[2916]: E1212 18:37:42.402236 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:37:42.420979 kubelet[2916]: E1212 18:37:42.420913 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:492d27a1f7a946e6906773b2f3082bed,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:42.428727 containerd[1626]: time="2025-12-12T18:37:42.428703845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:37:42.796749 containerd[1626]: time="2025-12-12T18:37:42.796707763Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:42.800639 containerd[1626]: time="2025-12-12T18:37:42.800561152Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:37:42.800639 containerd[1626]: time="2025-12-12T18:37:42.800591627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:37:42.800876 kubelet[2916]: E1212 18:37:42.800854 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:37:42.800948 kubelet[2916]: E1212 18:37:42.800937 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:37:42.801064 kubelet[2916]: E1212 18:37:42.801039 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:42.803154 kubelet[2916]: E1212 18:37:42.803106 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:37:43.339979 systemd-networkd[1303]: calia7c935f66a5: Gained IPv6LL Dec 12 18:37:43.348835 kubelet[2916]: E1212 18:37:43.342164 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:37:43.723972 systemd-networkd[1303]: vxlan.calico: Gained IPv6LL Dec 12 18:37:45.098580 containerd[1626]: time="2025-12-12T18:37:45.098451551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-47vm8,Uid:7abc4147-fe9a-4147-b541-36ac776f31d2,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:37:45.098580 containerd[1626]: time="2025-12-12T18:37:45.098487938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gxxjb,Uid:8cb31e56-69d1-46a2-81fe-73abfd4d083c,Namespace:kube-system,Attempt:0,}" Dec 12 18:37:45.231221 systemd-networkd[1303]: cali1b00f593b1b: Link UP Dec 12 18:37:45.231712 systemd-networkd[1303]: cali1b00f593b1b: Gained carrier Dec 12 18:37:45.243311 containerd[1626]: 2025-12-12 18:37:45.160 [INFO][4343] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0 calico-apiserver-654cf49445- calico-apiserver 7abc4147-fe9a-4147-b541-36ac776f31d2 804 0 2025-12-12 18:37:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:654cf49445 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-654cf49445-47vm8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1b00f593b1b [] [] }} ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-" Dec 12 18:37:45.243311 containerd[1626]: 2025-12-12 18:37:45.160 [INFO][4343] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.243311 containerd[1626]: 2025-12-12 18:37:45.194 [INFO][4355] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" HandleID="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Workload="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.194 [INFO][4355] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" 
HandleID="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Workload="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-654cf49445-47vm8", "timestamp":"2025-12-12 18:37:45.194002819 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.194 [INFO][4355] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.194 [INFO][4355] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.194 [INFO][4355] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.199 [INFO][4355] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" host="localhost" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.205 [INFO][4355] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.211 [INFO][4355] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.212 [INFO][4355] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.214 [INFO][4355] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:45.243900 containerd[1626]: 2025-12-12 18:37:45.214 [INFO][4355] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" host="localhost" Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.215 [INFO][4355] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553 Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.217 [INFO][4355] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" host="localhost" Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4355] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" host="localhost" Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4355] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" host="localhost" Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4355] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:37:45.244584 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4355] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" HandleID="k8s-pod-network.4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Workload="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.245655 containerd[1626]: 2025-12-12 18:37:45.226 [INFO][4343] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0", GenerateName:"calico-apiserver-654cf49445-", Namespace:"calico-apiserver", SelfLink:"", UID:"7abc4147-fe9a-4147-b541-36ac776f31d2", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"654cf49445", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-654cf49445-47vm8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1b00f593b1b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:45.245722 containerd[1626]: 2025-12-12 18:37:45.226 [INFO][4343] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.245722 containerd[1626]: 2025-12-12 18:37:45.226 [INFO][4343] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1b00f593b1b ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.245722 containerd[1626]: 2025-12-12 18:37:45.230 [INFO][4343] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.246144 containerd[1626]: 2025-12-12 18:37:45.232 [INFO][4343] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0", GenerateName:"calico-apiserver-654cf49445-", Namespace:"calico-apiserver", SelfLink:"", UID:"7abc4147-fe9a-4147-b541-36ac776f31d2", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"654cf49445", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553", Pod:"calico-apiserver-654cf49445-47vm8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1b00f593b1b", MAC:"8a:f0:6c:4f:80:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:45.246228 containerd[1626]: 2025-12-12 18:37:45.239 [INFO][4343] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-47vm8" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--47vm8-eth0" Dec 12 18:37:45.257578 containerd[1626]: time="2025-12-12T18:37:45.257475269Z" level=info msg="connecting to shim 4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553" address="unix:///run/containerd/s/30302ac24e9536f3dfe63cee44fc24f07449f1a9fc2b678072e0188f3a1049ee" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:45.279894 systemd[1]: Started cri-containerd-4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553.scope - libcontainer container 4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553. 
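[Editor's note] Both whisker pulls above died with an HTTP 404 from ghcr.io, which containerd surfaces as gRPC code = NotFound and kubelet escalates from ErrImagePull to ImagePullBackOff on subsequent syncs; the calico-apiserver pull just below meets the same 404. A sketch of distinguishing a genuinely missing tag from a transient pull failure with the containerd client, reusing the client and namespace setup from the earlier pull sketch:

    package main

    import (
        "context"
        "fmt"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/errdefs"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            panic(err)
        }
        defer client.Close()
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        _, err = client.Pull(ctx, "ghcr.io/flatcar/calico/whisker:v3.30.4")
        switch {
        case errdefs.IsNotFound(err):
            // Registry answered 404 for the tag: the "code = NotFound"
            // kubelet logs before backing off. Retrying cannot help
            // until the tag actually exists.
            fmt.Println("tag does not exist:", err)
        case err != nil:
            fmt.Println("pull failed:", err)
        default:
            fmt.Println("pull succeeded")
        }
    }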
Dec 12 18:37:45.290937 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:45.326121 systemd-networkd[1303]: calie71fecb3d49: Link UP Dec 12 18:37:45.327156 systemd-networkd[1303]: calie71fecb3d49: Gained carrier Dec 12 18:37:45.346271 containerd[1626]: time="2025-12-12T18:37:45.346218433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-47vm8,Uid:7abc4147-fe9a-4147-b541-36ac776f31d2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4fcd566cc6197535b6871bdb72155371ff548388bebd21e80cc839055b7c7553\"" Dec 12 18:37:45.350852 containerd[1626]: time="2025-12-12T18:37:45.350371090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:37:45.358073 containerd[1626]: 2025-12-12 18:37:45.173 [INFO][4332] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0 coredns-668d6bf9bc- kube-system 8cb31e56-69d1-46a2-81fe-73abfd4d083c 806 0 2025-12-12 18:37:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-gxxjb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie71fecb3d49 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-" Dec 12 18:37:45.358073 containerd[1626]: 2025-12-12 18:37:45.173 [INFO][4332] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.358073 containerd[1626]: 2025-12-12 18:37:45.209 [INFO][4360] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" HandleID="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Workload="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.209 [INFO][4360] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" HandleID="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Workload="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad3a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-gxxjb", "timestamp":"2025-12-12 18:37:45.209359037 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.209 [INFO][4360] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4360] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.221 [INFO][4360] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.302 [INFO][4360] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" host="localhost" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.306 [INFO][4360] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.309 [INFO][4360] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.310 [INFO][4360] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.311 [INFO][4360] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:45.358280 containerd[1626]: 2025-12-12 18:37:45.311 [INFO][4360] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" host="localhost" Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.312 [INFO][4360] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.314 [INFO][4360] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" host="localhost" Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.319 [INFO][4360] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" host="localhost" Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.320 [INFO][4360] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" host="localhost" Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.320 [INFO][4360] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:37:45.359467 containerd[1626]: 2025-12-12 18:37:45.320 [INFO][4360] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" HandleID="k8s-pod-network.8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Workload="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.359709 containerd[1626]: 2025-12-12 18:37:45.323 [INFO][4332] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8cb31e56-69d1-46a2-81fe-73abfd4d083c", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-gxxjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie71fecb3d49", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:45.360307 containerd[1626]: 2025-12-12 18:37:45.323 [INFO][4332] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.360307 containerd[1626]: 2025-12-12 18:37:45.323 [INFO][4332] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie71fecb3d49 ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.360307 containerd[1626]: 2025-12-12 18:37:45.327 [INFO][4332] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.360416 
containerd[1626]: 2025-12-12 18:37:45.327 [INFO][4332] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8cb31e56-69d1-46a2-81fe-73abfd4d083c", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced", Pod:"coredns-668d6bf9bc-gxxjb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie71fecb3d49", MAC:"76:40:e6:54:ba:60", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:45.360416 containerd[1626]: 2025-12-12 18:37:45.343 [INFO][4332] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" Namespace="kube-system" Pod="coredns-668d6bf9bc-gxxjb" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--gxxjb-eth0" Dec 12 18:37:45.382873 containerd[1626]: time="2025-12-12T18:37:45.382772841Z" level=info msg="connecting to shim 8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced" address="unix:///run/containerd/s/9922027a8033037f8bb635b32b3da49760769f9cc03c6b7fc31e55ff1045a219" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:45.402979 systemd[1]: Started cri-containerd-8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced.scope - libcontainer container 8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced. 
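[Editor's note] The coredns WorkloadEndpoint above is the first in this log to carry named ports, rendered in Go hex notation; decoding them recovers the familiar CoreDNS ports:

    0x35   = 3*16 + 5                   = 53    (dns/UDP and dns-tcp/TCP)
    0x23c1 = 2*4096 + 3*256 + 12*16 + 1 = 9153  (metrics/TCP)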
Dec 12 18:37:45.415319 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:45.444029 containerd[1626]: time="2025-12-12T18:37:45.443939070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gxxjb,Uid:8cb31e56-69d1-46a2-81fe-73abfd4d083c,Namespace:kube-system,Attempt:0,} returns sandbox id \"8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced\"" Dec 12 18:37:45.457315 containerd[1626]: time="2025-12-12T18:37:45.457232482Z" level=info msg="CreateContainer within sandbox \"8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 18:37:45.467770 containerd[1626]: time="2025-12-12T18:37:45.467745881Z" level=info msg="Container de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:45.470211 containerd[1626]: time="2025-12-12T18:37:45.470062018Z" level=info msg="CreateContainer within sandbox \"8b5d78522a0743b90638ff6b38618ea68e8033bfa3351bd4249dff54c77beced\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6\"" Dec 12 18:37:45.470403 containerd[1626]: time="2025-12-12T18:37:45.470393526Z" level=info msg="StartContainer for \"de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6\"" Dec 12 18:37:45.470990 containerd[1626]: time="2025-12-12T18:37:45.470955133Z" level=info msg="connecting to shim de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6" address="unix:///run/containerd/s/9922027a8033037f8bb635b32b3da49760769f9cc03c6b7fc31e55ff1045a219" protocol=ttrpc version=3 Dec 12 18:37:45.489947 systemd[1]: Started cri-containerd-de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6.scope - libcontainer container de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6. 
Dec 12 18:37:45.524808 containerd[1626]: time="2025-12-12T18:37:45.524741466Z" level=info msg="StartContainer for \"de34ab9ae5a242bb2f5d20cc389c6b341e62b7a347575d62564bc865f69576c6\" returns successfully" Dec 12 18:37:45.669975 containerd[1626]: time="2025-12-12T18:37:45.669806471Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:45.673072 containerd[1626]: time="2025-12-12T18:37:45.673047987Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:37:45.673158 containerd[1626]: time="2025-12-12T18:37:45.673103944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:37:45.673269 kubelet[2916]: E1212 18:37:45.673230 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:45.673463 kubelet[2916]: E1212 18:37:45.673265 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:45.673463 kubelet[2916]: E1212 18:37:45.673350 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlz27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:45.674631 kubelet[2916]: E1212 18:37:45.674607 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:37:46.098261 containerd[1626]: time="2025-12-12T18:37:46.098072667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pczf2,Uid:add8767b-e2e5-42f4-972c-f607ea5079b5,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:46.169514 systemd-networkd[1303]: calid879ccbb62c: Link UP Dec 12 18:37:46.169965 systemd-networkd[1303]: calid879ccbb62c: Gained carrier Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.124 [INFO][4512] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--666569f655--pczf2-eth0 goldmane-666569f655- calico-system add8767b-e2e5-42f4-972c-f607ea5079b5 803 0 2025-12-12 18:37:18 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-666569f655-pczf2 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid879ccbb62c [] [] }} ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.124 [INFO][4512] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.144 [INFO][4524] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" HandleID="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Workload="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.144 [INFO][4524] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" HandleID="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Workload="localhost-k8s-goldmane--666569f655--pczf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-666569f655-pczf2", "timestamp":"2025-12-12 18:37:46.144532307 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.144 [INFO][4524] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.144 [INFO][4524] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.144 [INFO][4524] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.149 [INFO][4524] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.151 [INFO][4524] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.153 [INFO][4524] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.157 [INFO][4524] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.158 [INFO][4524] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.158 [INFO][4524] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.159 [INFO][4524] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.160 [INFO][4524] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.164 [INFO][4524] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.165 [INFO][4524] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] 
handle="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" host="localhost" Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.165 [INFO][4524] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 18:37:46.179476 containerd[1626]: 2025-12-12 18:37:46.165 [INFO][4524] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" HandleID="k8s-pod-network.0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Workload="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.166 [INFO][4512] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--pczf2-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"add8767b-e2e5-42f4-972c-f607ea5079b5", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 18, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-666569f655-pczf2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid879ccbb62c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.166 [INFO][4512] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.166 [INFO][4512] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid879ccbb62c ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.170 [INFO][4512] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.170 [INFO][4512] cni-plugin/k8s.go 446: Added Mac, interface name, and active
container ID to endpoint ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--pczf2-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"add8767b-e2e5-42f4-972c-f607ea5079b5", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 18, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d", Pod:"goldmane-666569f655-pczf2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid879ccbb62c", MAC:"d2:a9:af:3b:4c:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:46.185544 containerd[1626]: 2025-12-12 18:37:46.177 [INFO][4512] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" Namespace="calico-system" Pod="goldmane-666569f655-pczf2" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--pczf2-eth0" Dec 12 18:37:46.202571 containerd[1626]: time="2025-12-12T18:37:46.202292867Z" level=info msg="connecting to shim 0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d" address="unix:///run/containerd/s/65242a2537eae7d974149bc6dc6dac254a97a8da51e0a08a3352d40917d36735" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:46.227946 systemd[1]: Started cri-containerd-0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d.scope - libcontainer container 0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d.
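The goldmane endpoint is written back with MAC d2:a9:af:3b:4c:19, and systemd-networkd later reports calid879ccbb62c: Gained IPv6LL. A sketch of the classic EUI-64 derivation (RFC 4291) that predicts the link-local address from that MAC; whether the kernel actually uses EUI-64 or a stable-privacy address depends on the interface's addr_gen_mode, so treat the result as illustrative:

```go
package main

import (
	"fmt"
	"net"
)

// linkLocalFromMAC derives the EUI-64 based IPv6 link-local address:
// flip the universal/local bit of the first octet and splice ff:fe
// into the middle of the 6-byte MAC.
func linkLocalFromMAC(mac net.HardwareAddr) net.IP {
	ip := make(net.IP, net.IPv6len)
	ip[0], ip[1] = 0xfe, 0x80
	ip[8] = mac[0] ^ 0x02
	ip[9], ip[10], ip[11] = mac[1], mac[2], 0xff
	ip[12], ip[13], ip[14], ip[15] = 0xfe, mac[3], mac[4], mac[5]
	return ip
}

func main() {
	mac, _ := net.ParseMAC("d2:a9:af:3b:4c:19") // MAC from the endpoint above
	fmt.Println(linkLocalFromMAC(mac))          // fe80::d0a9:afff:fe3b:4c19
}
```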
Dec 12 18:37:46.236416 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:46.259854 containerd[1626]: time="2025-12-12T18:37:46.259828821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-pczf2,Uid:add8767b-e2e5-42f4-972c-f607ea5079b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"0f259f5bc9897d54e7075bf20b5adb5596cabebc766d05a673a707d5ecc84b5d\"" Dec 12 18:37:46.262300 containerd[1626]: time="2025-12-12T18:37:46.260917930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:37:46.353685 kubelet[2916]: E1212 18:37:46.353619 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:37:46.386640 kubelet[2916]: I1212 18:37:46.386593 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gxxjb" podStartSLOduration=39.382103944 podStartE2EDuration="39.382103944s" podCreationTimestamp="2025-12-12 18:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:46.369986567 +0000 UTC m=+44.401349249" watchObservedRunningTime="2025-12-12 18:37:46.382103944 +0000 UTC m=+44.413466619" Dec 12 18:37:46.649631 containerd[1626]: time="2025-12-12T18:37:46.649554088Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:46.650025 containerd[1626]: time="2025-12-12T18:37:46.650003847Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:37:46.650071 containerd[1626]: time="2025-12-12T18:37:46.650058853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:37:46.650180 kubelet[2916]: E1212 18:37:46.650156 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:37:46.650219 kubelet[2916]: E1212 18:37:46.650186 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:37:46.650292 kubelet[2916]: E1212 18:37:46.650270 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6dh2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pczf2_calico-system(add8767b-e2e5-42f4-972c-f607ea5079b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:46.651491 kubelet[2916]: E1212 18:37:46.651462 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:37:46.860170 systemd-networkd[1303]: 
cali1b00f593b1b: Gained IPv6LL Dec 12 18:37:47.051969 systemd-networkd[1303]: calie71fecb3d49: Gained IPv6LL Dec 12 18:37:47.098335 containerd[1626]: time="2025-12-12T18:37:47.098301988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kv4lq,Uid:8031345b-ae64-4c92-b720-42f91263ef98,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:47.098598 containerd[1626]: time="2025-12-12T18:37:47.098580040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-kj5np,Uid:29190c43-36b5-4106-a010-9c89c155f184,Namespace:calico-apiserver,Attempt:0,}" Dec 12 18:37:47.098713 containerd[1626]: time="2025-12-12T18:37:47.098697365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6989cc9fb7-8wtsx,Uid:8796fb6d-23f3-4add-b202-c640f2b3e6d5,Namespace:calico-system,Attempt:0,}" Dec 12 18:37:47.244396 systemd-networkd[1303]: cali12080807747: Link UP Dec 12 18:37:47.245083 systemd-networkd[1303]: cali12080807747: Gained carrier Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.139 [INFO][4592] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0 calico-apiserver-654cf49445- calico-apiserver 29190c43-36b5-4106-a010-9c89c155f184 805 0 2025-12-12 18:37:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:654cf49445 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-654cf49445-kj5np eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali12080807747 [] [] }} ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.139 [INFO][4592] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.180 [INFO][4629] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" HandleID="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Workload="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.180 [INFO][4629] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" HandleID="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Workload="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c35b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-654cf49445-kj5np", "timestamp":"2025-12-12 18:37:47.180597491 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.180 [INFO][4629] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.180 [INFO][4629] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.180 [INFO][4629] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.187 [INFO][4629] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.190 [INFO][4629] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.195 [INFO][4629] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.197 [INFO][4629] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.199 [INFO][4629] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.199 [INFO][4629] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.200 [INFO][4629] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143 Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.218 [INFO][4629] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4629] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4629] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" host="localhost" Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4629] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
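The ipam records above repeat a fixed sequence for every pod: acquire the host-wide lock, confirm this host's affinity to block 192.168.88.128/26, pick a free address, and write the block back to claim it (.131, .132, .133, ... in this stretch of the log). A simplified model of the address pick, assuming lowest-free-address selection inside the /26; real Calico IPAM tracks allocations as ordinals in a block document and has extra policy (handles, reservations, retries on write conflict):

```go
package main

import (
	"fmt"
	"net/netip"
)

// nextFree returns the first address in the block not yet allocated.
// Calico claims the winner with a compare-and-swap write of the block,
// which is why the log takes a host-wide IPAM lock around this step.
func nextFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !used[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")
	used := map[netip.Addr]bool{}
	// Mark .128-.132 as taken by the node and earlier pods in this log.
	for a := block.Addr(); a.Compare(netip.MustParseAddr("192.168.88.133")) < 0; a = a.Next() {
		used[a] = true
	}
	if ip, ok := nextFree(block, used); ok {
		fmt.Println(ip) // 192.168.88.133, as claimed above
	}
}
```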
Dec 12 18:37:47.284290 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4629] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" HandleID="k8s-pod-network.45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Workload="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.236 [INFO][4592] cni-plugin/k8s.go 418: Populated endpoint ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0", GenerateName:"calico-apiserver-654cf49445-", Namespace:"calico-apiserver", SelfLink:"", UID:"29190c43-36b5-4106-a010-9c89c155f184", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 16, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"654cf49445", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-654cf49445-kj5np", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12080807747", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.236 [INFO][4592] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.236 [INFO][4592] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12080807747 ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.244 [INFO][4592] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.246 [INFO][4592] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint
ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0", GenerateName:"calico-apiserver-654cf49445-", Namespace:"calico-apiserver", SelfLink:"", UID:"29190c43-36b5-4106-a010-9c89c155f184", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 16, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"654cf49445", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143", Pod:"calico-apiserver-654cf49445-kj5np", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali12080807747", MAC:"ea:c5:d2:25:47:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.284950 containerd[1626]: 2025-12-12 18:37:47.282 [INFO][4592] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" Namespace="calico-apiserver" Pod="calico-apiserver-654cf49445-kj5np" WorkloadEndpoint="localhost-k8s-calico--apiserver--654cf49445--kj5np-eth0" Dec 12 18:37:47.423925 systemd-networkd[1303]: calic92ee9bf7c3: Link UP Dec 12 18:37:47.425041 systemd-networkd[1303]: calic92ee9bf7c3: Gained carrier Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.157 [INFO][4611] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0 calico-kube-controllers-6989cc9fb7- calico-system 8796fb6d-23f3-4add-b202-c640f2b3e6d5 800 0 2025-12-12 18:37:20 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6989cc9fb7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6989cc9fb7-8wtsx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic92ee9bf7c3 [] [] }} ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.157 [INFO][4611] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s
ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.185 [INFO][4637] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" HandleID="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Workload="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.185 [INFO][4637] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" HandleID="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Workload="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5660), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6989cc9fb7-8wtsx", "timestamp":"2025-12-12 18:37:47.185037493 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.185 [INFO][4637] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4637] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.233 [INFO][4637] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.287 [INFO][4637] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.291 [INFO][4637] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.294 [INFO][4637] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.327 [INFO][4637] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.328 [INFO][4637] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.328 [INFO][4637] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.329 [INFO][4637] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.370 [INFO][4637] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" host="localhost" Dec 12 18:37:47.465127 
containerd[1626]: 2025-12-12 18:37:47.409 [INFO][4637] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.409 [INFO][4637] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" host="localhost" Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.409 [INFO][4637] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 18:37:47.465127 containerd[1626]: 2025-12-12 18:37:47.409 [INFO][4637] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" HandleID="k8s-pod-network.d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Workload="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.412 [INFO][4611] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0", GenerateName:"calico-kube-controllers-6989cc9fb7-", Namespace:"calico-system", SelfLink:"", UID:"8796fb6d-23f3-4add-b202-c640f2b3e6d5", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 20, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6989cc9fb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6989cc9fb7-8wtsx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic92ee9bf7c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.413 [INFO][4611] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.413 [INFO][4611] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic92ee9bf7c3 ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe"
Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.425 [INFO][4611] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.431 [INFO][4611] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0", GenerateName:"calico-kube-controllers-6989cc9fb7-", Namespace:"calico-system", SelfLink:"", UID:"8796fb6d-23f3-4add-b202-c640f2b3e6d5", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 20, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6989cc9fb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe", Pod:"calico-kube-controllers-6989cc9fb7-8wtsx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic92ee9bf7c3", MAC:"7a:fb:00:dc:0b:f9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.473212 containerd[1626]: 2025-12-12 18:37:47.462 [INFO][4611] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" Namespace="calico-system" Pod="calico-kube-controllers-6989cc9fb7-8wtsx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6989cc9fb7--8wtsx-eth0" Dec 12 18:37:47.499798 kubelet[2916]: E1212 18:37:47.498332 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12
18:37:47.501967 kubelet[2916]: E1212 18:37:47.500727 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:37:47.503159 containerd[1626]: time="2025-12-12T18:37:47.503129292Z" level=info msg="connecting to shim 45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143" address="unix:///run/containerd/s/12ddb63fbc0fbaedc90f0b59d5ff8c671600c5e5b496e286314e701b453a6a45" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:47.529914 systemd[1]: Started cri-containerd-45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143.scope - libcontainer container 45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143. Dec 12 18:37:47.546974 systemd-networkd[1303]: cali8cbcc08b03c: Link UP Dec 12 18:37:47.549502 systemd-networkd[1303]: cali8cbcc08b03c: Gained carrier Dec 12 18:37:47.560257 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:47.563903 systemd-networkd[1303]: calid879ccbb62c: Gained IPv6LL Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.158 [INFO][4593] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--kv4lq-eth0 csi-node-driver- calico-system 8031345b-ae64-4c92-b720-42f91263ef98 685 0 2025-12-12 18:37:20 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-kv4lq eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8cbcc08b03c [] [] }} ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.158 [INFO][4593] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.197 [INFO][4642] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" HandleID="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Workload="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.198 [INFO][4642] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" HandleID="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Workload="localhost-k8s-csi--node--driver--kv4lq-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5980), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-kv4lq", "timestamp":"2025-12-12 18:37:47.197467857 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.199 [INFO][4642] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.410 [INFO][4642] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.410 [INFO][4642] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.420 [INFO][4642] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.466 [INFO][4642] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.471 [INFO][4642] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.474 [INFO][4642] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.489 [INFO][4642] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.489 [INFO][4642] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.491 [INFO][4642] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.502 [INFO][4642] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.534 [INFO][4642] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.534 [INFO][4642] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" host="localhost" Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.534 [INFO][4642] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
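The ImagePullBackOff records interleaved above are kubelet's retry damping: after each ErrImagePull the next attempt is delayed, with the delay doubling up to a cap. A sketch of that schedule, assuming the upstream kubelet defaults of a 10s initial period and a 300s cap (the values kubelet passes to flowcontrol.NewBackOff; treat the exact constants as an assumption):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet defaults: 10s initial back-off, doubling per
	// consecutive pull failure, capped at 5 minutes.
	const (
		initial  = 10 * time.Second
		maxDelay = 300 * time.Second
	)
	d := initial
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("pull failure %d -> next retry in %v\n", attempt, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
}
```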
Dec 12 18:37:47.602267 containerd[1626]: 2025-12-12 18:37:47.534 [INFO][4642] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" HandleID="k8s-pod-network.c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Workload="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.538 [INFO][4593] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kv4lq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8031345b-ae64-4c92-b720-42f91263ef98", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 20, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-kv4lq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8cbcc08b03c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.538 [INFO][4593] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.538 [INFO][4593] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8cbcc08b03c ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.551 [INFO][4593] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.551 [INFO][4593] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq"
WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kv4lq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8031345b-ae64-4c92-b720-42f91263ef98", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 20, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b", Pod:"csi-node-driver-kv4lq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8cbcc08b03c", MAC:"e2:a8:f6:00:e6:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:47.612111 containerd[1626]: 2025-12-12 18:37:47.600 [INFO][4593] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" Namespace="calico-system" Pod="csi-node-driver-kv4lq" WorkloadEndpoint="localhost-k8s-csi--node--driver--kv4lq-eth0" Dec 12 18:37:47.617425 containerd[1626]: time="2025-12-12T18:37:47.617389453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-654cf49445-kj5np,Uid:29190c43-36b5-4106-a010-9c89c155f184,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"45697cede028c68fc7021b83ada9e0d4c8d0e9757e4a1a17fb1c274d0cbf4143\"" Dec 12 18:37:47.618420 containerd[1626]: time="2025-12-12T18:37:47.618404200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:37:47.669346 containerd[1626]: time="2025-12-12T18:37:47.669279586Z" level=info msg="connecting to shim d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe" address="unix:///run/containerd/s/efce55bb01050b7cb2375e9224e01016410093a63baafd226dcbcad3176dc7ab" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:47.691921 systemd[1]: Started cri-containerd-d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe.scope - libcontainer container d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe.
Dec 12 18:37:47.702693 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:47.739249 containerd[1626]: time="2025-12-12T18:37:47.739222101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6989cc9fb7-8wtsx,Uid:8796fb6d-23f3-4add-b202-c640f2b3e6d5,Namespace:calico-system,Attempt:0,} returns sandbox id \"d074655f3e939f9648756eea864402e78d76b35837a94551cf5b5e2c17e7cebe\"" Dec 12 18:37:47.797424 containerd[1626]: time="2025-12-12T18:37:47.797017003Z" level=info msg="connecting to shim c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b" address="unix:///run/containerd/s/cb56b8cd60afe9698ac3f57b656a954f3673407b3cf724dcba533d404e5cd1a6" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:47.822848 systemd[1]: Started cri-containerd-c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b.scope - libcontainer container c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b. Dec 12 18:37:47.834954 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:47.847144 containerd[1626]: time="2025-12-12T18:37:47.847112919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kv4lq,Uid:8031345b-ae64-4c92-b720-42f91263ef98,Namespace:calico-system,Attempt:0,} returns sandbox id \"c8adf835aa1ec15eb6be5338d7c2ebaa56642a1c096c6199874a3b070245204b\"" Dec 12 18:37:48.083665 containerd[1626]: time="2025-12-12T18:37:48.083500656Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:48.093236 containerd[1626]: time="2025-12-12T18:37:48.093173660Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:37:48.093236 containerd[1626]: time="2025-12-12T18:37:48.093218639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:37:48.093337 kubelet[2916]: E1212 18:37:48.093308 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:48.093381 kubelet[2916]: E1212 18:37:48.093343 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:48.093645 containerd[1626]: time="2025-12-12T18:37:48.093629908Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:37:48.098931 containerd[1626]: time="2025-12-12T18:37:48.098710354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n4q9k,Uid:71f4ef80-18f5-4487-91bd-c6eaf8bbba03,Namespace:kube-system,Attempt:0,}" Dec 12 18:37:48.131336 kubelet[2916]: E1212 18:37:48.131272 2916 kuberuntime_manager.go:1341] 
"Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrks7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-kj5np_calico-apiserver(29190c43-36b5-4106-a010-9c89c155f184): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:48.132463 kubelet[2916]: E1212 18:37:48.132439 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:37:48.226699 systemd-networkd[1303]: cali17796a5997b: Link UP Dec 12 18:37:48.228065 systemd-networkd[1303]: cali17796a5997b: Gained carrier Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.153 [INFO][4826] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0 coredns-668d6bf9bc- kube-system 71f4ef80-18f5-4487-91bd-c6eaf8bbba03 794 0 2025-12-12 18:37:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-n4q9k eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali17796a5997b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.153 [INFO][4826] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.170 [INFO][4837] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" HandleID="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Workload="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.170 [INFO][4837] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" HandleID="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Workload="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f590), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-n4q9k", "timestamp":"2025-12-12 18:37:48.170067729 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.170 [INFO][4837] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.170 [INFO][4837] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.170 [INFO][4837] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.177 [INFO][4837] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.179 [INFO][4837] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.181 [INFO][4837] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.182 [INFO][4837] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.185 [INFO][4837] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.185 [INFO][4837] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.186 [INFO][4837] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359 Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.195 [INFO][4837] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.220 [INFO][4837] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.220 [INFO][4837] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" host="localhost" Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.220 [INFO][4837] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 18:37:48.255963 containerd[1626]: 2025-12-12 18:37:48.220 [INFO][4837] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" HandleID="k8s-pod-network.29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Workload="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.265647 containerd[1626]: 2025-12-12 18:37:48.222 [INFO][4826] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"71f4ef80-18f5-4487-91bd-c6eaf8bbba03", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-n4q9k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali17796a5997b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:48.265647 containerd[1626]: 2025-12-12 18:37:48.222 [INFO][4826] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.265647 containerd[1626]: 2025-12-12 18:37:48.222 [INFO][4826] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali17796a5997b ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.265647 containerd[1626]: 2025-12-12 18:37:48.228 [INFO][4826] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.265647 
containerd[1626]: 2025-12-12 18:37:48.228 [INFO][4826] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"71f4ef80-18f5-4487-91bd-c6eaf8bbba03", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 18, 37, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359", Pod:"coredns-668d6bf9bc-n4q9k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali17796a5997b", MAC:"fa:38:06:2b:69:c6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 18:37:48.265647 containerd[1626]: 2025-12-12 18:37:48.251 [INFO][4826] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" Namespace="kube-system" Pod="coredns-668d6bf9bc-n4q9k" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--n4q9k-eth0" Dec 12 18:37:48.395906 systemd-networkd[1303]: cali12080807747: Gained IPv6LL Dec 12 18:37:48.427905 containerd[1626]: time="2025-12-12T18:37:48.427870391Z" level=info msg="connecting to shim 29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359" address="unix:///run/containerd/s/38494eb96fe89baf7743dc313d773a055c5fbfb62871571e5548124f71bea9b7" namespace=k8s.io protocol=ttrpc version=3 Dec 12 18:37:48.450948 systemd[1]: Started cri-containerd-29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359.scope - libcontainer container 29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359. 
Dec 12 18:37:48.464908 containerd[1626]: time="2025-12-12T18:37:48.464882798Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:48.465830 systemd-resolved[1543]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Dec 12 18:37:48.467427 containerd[1626]: time="2025-12-12T18:37:48.467401380Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:37:48.467471 containerd[1626]: time="2025-12-12T18:37:48.467457345Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:37:48.467768 kubelet[2916]: E1212 18:37:48.467650 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:37:48.467994 kubelet[2916]: E1212 18:37:48.467864 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:37:48.468334 containerd[1626]: time="2025-12-12T18:37:48.468118787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:37:48.468548 kubelet[2916]: E1212 18:37:48.468500 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdl6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6989cc9fb7-8wtsx_calico-system(8796fb6d-23f3-4add-b202-c640f2b3e6d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:48.469803 kubelet[2916]: E1212 18:37:48.469755 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:37:48.498568 containerd[1626]: time="2025-12-12T18:37:48.498483335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n4q9k,Uid:71f4ef80-18f5-4487-91bd-c6eaf8bbba03,Namespace:kube-system,Attempt:0,} returns sandbox id \"29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359\"" Dec 12 18:37:48.501188 kubelet[2916]: E1212 18:37:48.501140 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:37:48.502345 containerd[1626]: time="2025-12-12T18:37:48.502246523Z" level=info msg="CreateContainer within sandbox \"29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 18:37:48.506324 kubelet[2916]: E1212 18:37:48.506292 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:37:48.518682 containerd[1626]: time="2025-12-12T18:37:48.518659869Z" level=info msg="Container 111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a: CDI devices from CRI Config.CDIDevices: []" Dec 12 18:37:48.524333 containerd[1626]: time="2025-12-12T18:37:48.524309684Z" level=info msg="CreateContainer within sandbox \"29864894e77157cab5fcd9f63c064003cb2194cfb1bc8d9a2550448070735359\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a\"" Dec 12 18:37:48.526272 containerd[1626]: time="2025-12-12T18:37:48.526252400Z" level=info msg="StartContainer for \"111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a\"" Dec 12 18:37:48.527503 containerd[1626]: time="2025-12-12T18:37:48.527423856Z" level=info msg="connecting to shim 111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a" address="unix:///run/containerd/s/38494eb96fe89baf7743dc313d773a055c5fbfb62871571e5548124f71bea9b7" protocol=ttrpc version=3 Dec 12 18:37:48.542954 systemd[1]: Started cri-containerd-111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a.scope - libcontainer container 111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a. Dec 12 18:37:48.571832 containerd[1626]: time="2025-12-12T18:37:48.571795265Z" level=info msg="StartContainer for \"111e3c89dd8e98e360d64c2b966004f5047fe6e2462e9bf88a16a6e7e5d7301a\" returns successfully" Dec 12 18:37:48.798597 containerd[1626]: time="2025-12-12T18:37:48.798456032Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:48.803967 containerd[1626]: time="2025-12-12T18:37:48.803939260Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:37:48.804111 containerd[1626]: time="2025-12-12T18:37:48.804037601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:37:48.804280 kubelet[2916]: E1212 18:37:48.804246 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:37:48.804327 kubelet[2916]: E1212 18:37:48.804290 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:37:48.804425 kubelet[2916]: E1212 18:37:48.804378 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:48.806176 containerd[1626]: time="2025-12-12T18:37:48.806139178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:37:49.099940 systemd-networkd[1303]: calic92ee9bf7c3: Gained IPv6LL Dec 12 18:37:49.149692 containerd[1626]: time="2025-12-12T18:37:49.149491178Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:49.160483 containerd[1626]: time="2025-12-12T18:37:49.160458433Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:37:49.160611 containerd[1626]: time="2025-12-12T18:37:49.160520979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:37:49.161290 kubelet[2916]: E1212 18:37:49.160720 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:37:49.161290 kubelet[2916]: E1212 18:37:49.160750 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:37:49.161992 kubelet[2916]: E1212 18:37:49.161421 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:49.238533 kubelet[2916]: E1212 18:37:49.238497 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:49.509187 kubelet[2916]: E1212 18:37:49.509128 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:37:49.510006 kubelet[2916]: E1212 18:37:49.509440 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:37:49.510006 kubelet[2916]: E1212 18:37:49.509870 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:37:49.547879 systemd-networkd[1303]: cali8cbcc08b03c: Gained IPv6LL Dec 12 18:37:49.720269 kubelet[2916]: I1212 18:37:49.720211 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-n4q9k" podStartSLOduration=42.720195831 podStartE2EDuration="42.720195831s" podCreationTimestamp="2025-12-12 18:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 18:37:49.630309929 +0000 UTC m=+47.661672611" watchObservedRunningTime="2025-12-12 18:37:49.720195831 +0000 UTC m=+47.751558501" Dec 12 18:37:49.867912 systemd-networkd[1303]: cali17796a5997b: Gained IPv6LL Dec 12 18:37:57.098851 containerd[1626]: time="2025-12-12T18:37:57.098752947Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:37:57.432907 containerd[1626]: time="2025-12-12T18:37:57.432747414Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:57.446923 containerd[1626]: time="2025-12-12T18:37:57.446879975Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:37:57.447040 containerd[1626]: time="2025-12-12T18:37:57.446950496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:37:57.447068 kubelet[2916]: E1212 18:37:57.447039 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:37:57.447265 kubelet[2916]: E1212 18:37:57.447071 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:37:57.447265 kubelet[2916]: E1212 18:37:57.447144 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:492d27a1f7a946e6906773b2f3082bed,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:57.449033 
containerd[1626]: time="2025-12-12T18:37:57.449014968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:37:57.836135 containerd[1626]: time="2025-12-12T18:37:57.836061354Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:57.836547 containerd[1626]: time="2025-12-12T18:37:57.836470733Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:37:57.836547 containerd[1626]: time="2025-12-12T18:37:57.836529114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:37:57.836724 kubelet[2916]: E1212 18:37:57.836697 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:37:57.836827 kubelet[2916]: E1212 18:37:57.836813 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:37:57.836957 kubelet[2916]: E1212 18:37:57.836933 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:57.838183 kubelet[2916]: E1212 18:37:57.838143 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:37:58.100406 containerd[1626]: time="2025-12-12T18:37:58.100331257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:37:58.494167 containerd[1626]: time="2025-12-12T18:37:58.494070451Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:37:58.494641 containerd[1626]: time="2025-12-12T18:37:58.494582405Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:37:58.494688 containerd[1626]: time="2025-12-12T18:37:58.494595640Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:37:58.494902 kubelet[2916]: E1212 18:37:58.494875 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:58.495313 kubelet[2916]: E1212 18:37:58.494907 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:37:58.495313 kubelet[2916]: E1212 18:37:58.494995 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlz27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:37:58.496586 kubelet[2916]: E1212 18:37:58.496551 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:38:00.099291 containerd[1626]: time="2025-12-12T18:38:00.099197742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:38:00.464665 containerd[1626]: time="2025-12-12T18:38:00.464556934Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:00.465095 containerd[1626]: time="2025-12-12T18:38:00.465050400Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:38:00.465162 containerd[1626]: time="2025-12-12T18:38:00.465117091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:38:00.465297 kubelet[2916]: E1212 18:38:00.465268 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:38:00.465550 kubelet[2916]: E1212 18:38:00.465316 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:38:00.465848 kubelet[2916]: E1212 18:38:00.465442 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:00.467700 containerd[1626]: time="2025-12-12T18:38:00.467668274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:38:00.815772 containerd[1626]: time="2025-12-12T18:38:00.815644203Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:00.816058 containerd[1626]: time="2025-12-12T18:38:00.816015181Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:38:00.816096 containerd[1626]: time="2025-12-12T18:38:00.816072120Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:38:00.816267 kubelet[2916]: E1212 18:38:00.816220 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:38:00.816304 kubelet[2916]: E1212 18:38:00.816267 2916 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:38:00.816397 kubelet[2916]: E1212 18:38:00.816340 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:00.817676 kubelet[2916]: E1212 18:38:00.817650 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:38:01.101467 containerd[1626]: time="2025-12-12T18:38:01.101359312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:38:01.444101 containerd[1626]: time="2025-12-12T18:38:01.443897375Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:01.453692 containerd[1626]: time="2025-12-12T18:38:01.453630470Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:38:01.453692 containerd[1626]: time="2025-12-12T18:38:01.453680722Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:38:01.453828 kubelet[2916]: E1212 18:38:01.453768 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:38:01.453828 kubelet[2916]: E1212 18:38:01.453821 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:38:01.453976 kubelet[2916]: E1212 18:38:01.453924 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6dh2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pczf2_calico-system(add8767b-e2e5-42f4-972c-f607ea5079b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:01.455029 kubelet[2916]: E1212 18:38:01.455008 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:38:04.100008 containerd[1626]: time="2025-12-12T18:38:04.099208087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:38:04.444513 containerd[1626]: time="2025-12-12T18:38:04.444407772Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:04.444917 containerd[1626]: time="2025-12-12T18:38:04.444881987Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:38:04.444958 containerd[1626]: time="2025-12-12T18:38:04.444944575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:38:04.445080 kubelet[2916]: E1212 18:38:04.445039 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:38:04.445633 kubelet[2916]: E1212 18:38:04.445082 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:38:04.445656 containerd[1626]: time="2025-12-12T18:38:04.445286816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:38:04.445823 kubelet[2916]: E1212 18:38:04.445764 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdl6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6989cc9fb7-8wtsx_calico-system(8796fb6d-23f3-4add-b202-c640f2b3e6d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:04.447023 kubelet[2916]: E1212 18:38:04.446992 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:38:04.758567 containerd[1626]: time="2025-12-12T18:38:04.758348404Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:04.758849 containerd[1626]: time="2025-12-12T18:38:04.758827911Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:38:04.758956 containerd[1626]: time="2025-12-12T18:38:04.758889757Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:38:04.759118 kubelet[2916]: E1212 18:38:04.759085 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:04.759179 kubelet[2916]: E1212 18:38:04.759127 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:04.759241 kubelet[2916]: E1212 18:38:04.759209 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrks7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-kj5np_calico-apiserver(29190c43-36b5-4106-a010-9c89c155f184): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:04.760432 kubelet[2916]: E1212 18:38:04.760399 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:38:09.099114 kubelet[2916]: E1212 18:38:09.098941 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:38:10.098652 kubelet[2916]: E1212 18:38:10.098464 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:38:15.098978 kubelet[2916]: E1212 18:38:15.098622 2916 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:38:16.099179 kubelet[2916]: E1212 18:38:16.099090 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:38:16.100633 kubelet[2916]: E1212 18:38:16.100008 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:38:19.099437 kubelet[2916]: E1212 18:38:19.099404 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:38:20.099262 containerd[1626]: time="2025-12-12T18:38:20.099021973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:38:20.453266 containerd[1626]: time="2025-12-12T18:38:20.453004518Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:20.453392 containerd[1626]: time="2025-12-12T18:38:20.453367638Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:38:20.453443 containerd[1626]: time="2025-12-12T18:38:20.453434596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:38:20.453611 kubelet[2916]: E1212 18:38:20.453557 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:38:20.453888 kubelet[2916]: E1212 18:38:20.453619 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:38:20.454467 kubelet[2916]: E1212 18:38:20.454428 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:492d27a1f7a946e6906773b2f3082bed,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:20.456493 containerd[1626]: time="2025-12-12T18:38:20.456423180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:38:20.785988 containerd[1626]: time="2025-12-12T18:38:20.785873829Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:20.788928 containerd[1626]: time="2025-12-12T18:38:20.788900322Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:38:20.788994 containerd[1626]: time="2025-12-12T18:38:20.788962347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:38:20.789098 kubelet[2916]: E1212 18:38:20.789057 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:38:20.789098 kubelet[2916]: E1212 18:38:20.789095 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:38:20.789395 kubelet[2916]: E1212 18:38:20.789348 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:20.790549 kubelet[2916]: E1212 18:38:20.790511 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:38:23.099205 containerd[1626]: time="2025-12-12T18:38:23.099029425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:38:23.448757 containerd[1626]: time="2025-12-12T18:38:23.446388751Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:23.449245 containerd[1626]: time="2025-12-12T18:38:23.449147070Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:38:23.449245 containerd[1626]: time="2025-12-12T18:38:23.449220187Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:38:23.449483 kubelet[2916]: E1212 18:38:23.449309 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:23.449483 kubelet[2916]: E1212 18:38:23.449338 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:23.449483 kubelet[2916]: E1212 18:38:23.449417 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlz27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:23.450683 kubelet[2916]: E1212 18:38:23.450663 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:38:27.102713 containerd[1626]: time="2025-12-12T18:38:27.101622605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 18:38:27.491473 containerd[1626]: time="2025-12-12T18:38:27.491266404Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:27.491729 containerd[1626]: time="2025-12-12T18:38:27.491707865Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 18:38:27.491854 containerd[1626]: time="2025-12-12T18:38:27.491757178Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 18:38:27.492992 kubelet[2916]: E1212 18:38:27.492555 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:38:27.492992 kubelet[2916]: E1212 18:38:27.492595 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 18:38:27.492992 kubelet[2916]: E1212 18:38:27.492688 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:27.494765 containerd[1626]: time="2025-12-12T18:38:27.494748316Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 18:38:27.703964 systemd[1]: Started sshd@7-139.178.70.101:22-147.75.109.163:51796.service - OpenSSH per-connection server daemon (147.75.109.163:51796). 
Dec 12 18:38:27.858433 containerd[1626]: time="2025-12-12T18:38:27.857973615Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:27.858894 containerd[1626]: time="2025-12-12T18:38:27.858852636Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 18:38:27.858938 containerd[1626]: time="2025-12-12T18:38:27.858912212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 18:38:27.859110 kubelet[2916]: E1212 18:38:27.859079 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:38:27.859150 kubelet[2916]: E1212 18:38:27.859116 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 18:38:27.859232 kubelet[2916]: E1212 18:38:27.859194 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhlst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-kv4lq_calico-system(8031345b-ae64-4c92-b720-42f91263ef98): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:27.860511 kubelet[2916]: E1212 18:38:27.860483 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:38:27.864511 sshd[5009]: Accepted publickey for core from 147.75.109.163 port 51796 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:27.867738 sshd-session[5009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:27.883296 systemd-logind[1608]: New session 10 of user core. Dec 12 18:38:27.891077 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 12 18:38:28.748912 sshd[5012]: Connection closed by 147.75.109.163 port 51796 Dec 12 18:38:28.749119 sshd-session[5009]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:28.757539 systemd[1]: sshd@7-139.178.70.101:22-147.75.109.163:51796.service: Deactivated successfully. Dec 12 18:38:28.761349 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 18:38:28.762776 systemd-logind[1608]: Session 10 logged out. Waiting for processes to exit. Dec 12 18:38:28.763851 systemd-logind[1608]: Removed session 10. Dec 12 18:38:29.116964 containerd[1626]: time="2025-12-12T18:38:29.116878445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 18:38:29.470587 containerd[1626]: time="2025-12-12T18:38:29.470445155Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:29.475246 containerd[1626]: time="2025-12-12T18:38:29.475148938Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 18:38:29.475246 containerd[1626]: time="2025-12-12T18:38:29.475222664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 18:38:29.475385 kubelet[2916]: E1212 18:38:29.475338 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:38:29.475981 kubelet[2916]: E1212 18:38:29.475385 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 18:38:29.476009 containerd[1626]: time="2025-12-12T18:38:29.475631283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 18:38:29.476100 kubelet[2916]: E1212 18:38:29.476067 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdl6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6989cc9fb7-8wtsx_calico-system(8796fb6d-23f3-4add-b202-c640f2b3e6d5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:29.477242 kubelet[2916]: E1212 18:38:29.477224 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:38:29.825630 containerd[1626]: time="2025-12-12T18:38:29.825597297Z" level=info msg="fetch failed 
after status: 404 Not Found" host=ghcr.io Dec 12 18:38:29.826158 containerd[1626]: time="2025-12-12T18:38:29.826071577Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 18:38:29.826158 containerd[1626]: time="2025-12-12T18:38:29.826135848Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 18:38:29.826282 kubelet[2916]: E1212 18:38:29.826252 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:38:29.826347 kubelet[2916]: E1212 18:38:29.826292 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 18:38:29.826489 kubelet[2916]: E1212 18:38:29.826446 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6dh2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-pczf2_calico-system(add8767b-e2e5-42f4-972c-f607ea5079b5): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:29.827977 kubelet[2916]: E1212 18:38:29.827921 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:38:33.098292 containerd[1626]: time="2025-12-12T18:38:33.098269178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:38:33.098972 kubelet[2916]: E1212 18:38:33.098224 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:38:33.506729 containerd[1626]: time="2025-12-12T18:38:33.506592901Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:38:33.516096 containerd[1626]: time="2025-12-12T18:38:33.516065990Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:38:33.516233 containerd[1626]: 
time="2025-12-12T18:38:33.516120226Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:38:33.516369 kubelet[2916]: E1212 18:38:33.516301 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:33.516369 kubelet[2916]: E1212 18:38:33.516330 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:38:33.516556 kubelet[2916]: E1212 18:38:33.516518 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrks7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-kj5np_calico-apiserver(29190c43-36b5-4106-a010-9c89c155f184): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:38:33.518187 kubelet[2916]: E1212 18:38:33.518165 2916 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:38:33.761217 systemd[1]: Started sshd@8-139.178.70.101:22-147.75.109.163:36570.service - OpenSSH per-connection server daemon (147.75.109.163:36570). Dec 12 18:38:33.817014 sshd[5027]: Accepted publickey for core from 147.75.109.163 port 36570 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:33.818472 sshd-session[5027]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:33.822695 systemd-logind[1608]: New session 11 of user core. Dec 12 18:38:33.827056 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 12 18:38:33.943807 sshd[5030]: Connection closed by 147.75.109.163 port 36570 Dec 12 18:38:33.944135 sshd-session[5027]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:33.946637 systemd-logind[1608]: Session 11 logged out. Waiting for processes to exit. Dec 12 18:38:33.947115 systemd[1]: sshd@8-139.178.70.101:22-147.75.109.163:36570.service: Deactivated successfully. Dec 12 18:38:33.948147 systemd[1]: session-11.scope: Deactivated successfully. Dec 12 18:38:33.949436 systemd-logind[1608]: Removed session 11. Dec 12 18:38:34.098463 kubelet[2916]: E1212 18:38:34.097880 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:38:38.953034 systemd[1]: Started sshd@9-139.178.70.101:22-147.75.109.163:36580.service - OpenSSH per-connection server daemon (147.75.109.163:36580). Dec 12 18:38:38.994355 sshd[5042]: Accepted publickey for core from 147.75.109.163 port 36580 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:38.995257 sshd-session[5042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:38.998092 systemd-logind[1608]: New session 12 of user core. Dec 12 18:38:39.002921 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 12 18:38:39.102036 sshd[5045]: Connection closed by 147.75.109.163 port 36580 Dec 12 18:38:39.102511 sshd-session[5042]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:39.112041 systemd[1]: sshd@9-139.178.70.101:22-147.75.109.163:36580.service: Deactivated successfully. Dec 12 18:38:39.114075 systemd[1]: session-12.scope: Deactivated successfully. Dec 12 18:38:39.114677 systemd-logind[1608]: Session 12 logged out. Waiting for processes to exit. Dec 12 18:38:39.116662 systemd[1]: Started sshd@10-139.178.70.101:22-147.75.109.163:36596.service - OpenSSH per-connection server daemon (147.75.109.163:36596). Dec 12 18:38:39.117265 systemd-logind[1608]: Removed session 12. 
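The repeated 404s above come from containerd resolving the tag against ghcr.io's OCI distribution API, and the same check can be reproduced outside the cluster. A minimal sketch in Go, assuming ghcr.io's usual anonymous-token flow for public images (repository and tag are copied from the log; none of this code is part of the kubelet or containerd):

    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
    )

    func main() {
        repo := "flatcar/calico/kube-controllers" // repository from the log
        tag := "v3.30.4"                          // the tag containerd fails to resolve

        // Step 1: fetch an anonymous pull token for the repository.
        resp, err := http.Get("https://ghcr.io/token?scope=repository:" + repo + ":pull")
        if err != nil {
            panic(err)
        }
        var tok struct {
            Token string `json:"token"`
        }
        if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
            panic(err)
        }
        resp.Body.Close()

        // Step 2: HEAD the manifest. 200 means the tag resolves; a 404 here is
        // exactly containerd's "fetch failed after status: 404 Not Found".
        req, _ := http.NewRequest("HEAD", "https://ghcr.io/v2/"+repo+"/manifests/"+tag, nil)
        req.Header.Set("Authorization", "Bearer "+tok.Token)
        req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
        res, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        res.Body.Close()
        fmt.Println(repo+":"+tag, "->", res.Status) // e.g. "-> 404 Not Found"
    }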
Dec 12 18:38:39.182213 sshd[5059]: Accepted publickey for core from 147.75.109.163 port 36596 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:39.183959 sshd-session[5059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:39.188376 systemd-logind[1608]: New session 13 of user core. Dec 12 18:38:39.192935 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 18:38:39.346871 sshd[5062]: Connection closed by 147.75.109.163 port 36596 Dec 12 18:38:39.347896 sshd-session[5059]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:39.355616 systemd[1]: sshd@10-139.178.70.101:22-147.75.109.163:36596.service: Deactivated successfully. Dec 12 18:38:39.358093 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 18:38:39.359332 systemd-logind[1608]: Session 13 logged out. Waiting for processes to exit. Dec 12 18:38:39.363432 systemd[1]: Started sshd@11-139.178.70.101:22-147.75.109.163:36602.service - OpenSSH per-connection server daemon (147.75.109.163:36602). Dec 12 18:38:39.366290 systemd-logind[1608]: Removed session 13. Dec 12 18:38:39.426396 sshd[5072]: Accepted publickey for core from 147.75.109.163 port 36602 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:39.428339 sshd-session[5072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:39.431062 systemd-logind[1608]: New session 14 of user core. Dec 12 18:38:39.436903 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 18:38:39.579508 sshd[5077]: Connection closed by 147.75.109.163 port 36602 Dec 12 18:38:39.579706 sshd-session[5072]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:39.584687 systemd-logind[1608]: Session 14 logged out. Waiting for processes to exit. Dec 12 18:38:39.585283 systemd[1]: sshd@11-139.178.70.101:22-147.75.109.163:36602.service: Deactivated successfully. Dec 12 18:38:39.587296 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 18:38:39.590456 systemd-logind[1608]: Removed session 14. 
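The interleaved sshd/systemd records above follow a fixed lifecycle: a per-connection sshd@...service starts, pam_unix opens the session, logind assigns a numbered session-N.scope, and both are torn down on disconnect. A hypothetical helper (not from the log) that pairs the open/close records when fed journal text on stdin, e.g. to confirm every session was cleanly removed:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    var (
        openRe  = regexp.MustCompile(`New session (\d+) of user (\S+)\.`)
        closeRe = regexp.MustCompile(`Removed session (\d+)\.`)
    )

    func main() {
        open := map[string]string{} // session id -> user
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
        for sc.Scan() {
            line := sc.Text()
            if m := openRe.FindStringSubmatch(line); m != nil {
                open[m[1]] = m[2]
            } else if m := closeRe.FindStringSubmatch(line); m != nil {
                delete(open, m[1])
            }
        }
        for id, user := range open {
            fmt.Printf("session %s (%s) never closed\n", id, user)
        }
    }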
Dec 12 18:38:42.099647 kubelet[2916]: E1212 18:38:42.098985 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:38:43.098799 kubelet[2916]: E1212 18:38:43.098759 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:38:43.098954 kubelet[2916]: E1212 18:38:43.098833 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:38:44.595249 systemd[1]: Started sshd@12-139.178.70.101:22-147.75.109.163:48672.service - OpenSSH per-connection server daemon (147.75.109.163:48672). Dec 12 18:38:44.768165 sshd[5118]: Accepted publickey for core from 147.75.109.163 port 48672 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:44.769240 sshd-session[5118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:44.772723 systemd-logind[1608]: New session 15 of user core. Dec 12 18:38:44.779940 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 18:38:44.896114 sshd[5121]: Connection closed by 147.75.109.163 port 48672 Dec 12 18:38:44.896545 sshd-session[5118]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:44.899182 systemd[1]: sshd@12-139.178.70.101:22-147.75.109.163:48672.service: Deactivated successfully. Dec 12 18:38:44.900896 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 18:38:44.901738 systemd-logind[1608]: Session 15 logged out. Waiting for processes to exit. Dec 12 18:38:44.903507 systemd-logind[1608]: Removed session 15. 
Dec 12 18:38:47.099519 kubelet[2916]: E1212 18:38:47.099443 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:38:47.100249 kubelet[2916]: E1212 18:38:47.100172 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:38:48.101008 kubelet[2916]: E1212 18:38:48.100943 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:38:49.908587 systemd[1]: Started sshd@13-139.178.70.101:22-147.75.109.163:48674.service - OpenSSH per-connection server daemon (147.75.109.163:48674). Dec 12 18:38:49.949284 sshd[5133]: Accepted publickey for core from 147.75.109.163 port 48674 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:49.950126 sshd-session[5133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:49.952582 systemd-logind[1608]: New session 16 of user core. Dec 12 18:38:49.959005 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 18:38:50.057590 sshd[5136]: Connection closed by 147.75.109.163 port 48674 Dec 12 18:38:50.057996 sshd-session[5133]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:50.060219 systemd[1]: sshd@13-139.178.70.101:22-147.75.109.163:48674.service: Deactivated successfully. Dec 12 18:38:50.061475 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 18:38:50.062742 systemd-logind[1608]: Session 16 logged out. Waiting for processes to exit. Dec 12 18:38:50.064446 systemd-logind[1608]: Removed session 16. 
Dec 12 18:38:54.100813 kubelet[2916]: E1212 18:38:54.100732 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:38:54.123536 kubelet[2916]: E1212 18:38:54.123506 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:38:55.072425 systemd[1]: Started sshd@14-139.178.70.101:22-147.75.109.163:60870.service - OpenSSH per-connection server daemon (147.75.109.163:60870). Dec 12 18:38:55.128935 sshd[5148]: Accepted publickey for core from 147.75.109.163 port 60870 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:38:55.130488 sshd-session[5148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:38:55.136705 systemd-logind[1608]: New session 17 of user core. Dec 12 18:38:55.140294 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 18:38:55.313527 sshd[5151]: Connection closed by 147.75.109.163 port 60870 Dec 12 18:38:55.317233 systemd[1]: sshd@14-139.178.70.101:22-147.75.109.163:60870.service: Deactivated successfully. Dec 12 18:38:55.313989 sshd-session[5148]: pam_unix(sshd:session): session closed for user core Dec 12 18:38:55.320919 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 18:38:55.322267 systemd-logind[1608]: Session 17 logged out. Waiting for processes to exit. Dec 12 18:38:55.323642 systemd-logind[1608]: Removed session 17. 
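By this point every failing image has moved from ErrImagePull to ImagePullBackOff: the kubelet logs the back-off on each pod sync but only contacts the registry once the back-off window expires. As an assumption based on the kubelet's documented defaults (a 10s base delay doubling to a 5m cap), the retry cadence behind these entries looks like:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        base, maxDelay := 10*time.Second, 5*time.Minute // assumed kubelet defaults
        delay, elapsed := base, time.Duration(0)
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("pull attempt %d at t+%-6s next retry in %s\n", attempt, elapsed, delay)
            elapsed += delay
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
    }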
Dec 12 18:38:58.099687 kubelet[2916]: E1212 18:38:58.099657 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:39:00.099427 kubelet[2916]: E1212 18:39:00.099283 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:39:00.100872 kubelet[2916]: E1212 18:39:00.100839 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:39:00.328941 systemd[1]: Started sshd@15-139.178.70.101:22-147.75.109.163:60872.service - OpenSSH per-connection server daemon (147.75.109.163:60872). Dec 12 18:39:00.374333 sshd[5165]: Accepted publickey for core from 147.75.109.163 port 60872 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:00.375153 sshd-session[5165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:00.378596 systemd-logind[1608]: New session 18 of user core. Dec 12 18:39:00.385942 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 18:39:00.529488 sshd[5168]: Connection closed by 147.75.109.163 port 60872 Dec 12 18:39:00.535832 systemd[1]: Started sshd@16-139.178.70.101:22-147.75.109.163:60886.service - OpenSSH per-connection server daemon (147.75.109.163:60886). Dec 12 18:39:00.550600 sshd-session[5165]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:00.552427 systemd[1]: sshd@15-139.178.70.101:22-147.75.109.163:60872.service: Deactivated successfully. Dec 12 18:39:00.553769 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 18:39:00.554748 systemd-logind[1608]: Session 18 logged out. Waiting for processes to exit. 
Dec 12 18:39:00.556301 systemd-logind[1608]: Removed session 18. Dec 12 18:39:00.740687 sshd[5177]: Accepted publickey for core from 147.75.109.163 port 60886 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:00.741457 sshd-session[5177]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:00.744674 systemd-logind[1608]: New session 19 of user core. Dec 12 18:39:00.750892 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 12 18:39:01.110777 kubelet[2916]: E1212 18:39:01.110510 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:39:01.121471 sshd[5183]: Connection closed by 147.75.109.163 port 60886 Dec 12 18:39:01.126308 sshd-session[5177]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:01.133548 systemd[1]: Started sshd@17-139.178.70.101:22-147.75.109.163:60900.service - OpenSSH per-connection server daemon (147.75.109.163:60900). Dec 12 18:39:01.138887 systemd[1]: sshd@16-139.178.70.101:22-147.75.109.163:60886.service: Deactivated successfully. Dec 12 18:39:01.143985 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 18:39:01.146315 systemd-logind[1608]: Session 19 logged out. Waiting for processes to exit. Dec 12 18:39:01.147379 systemd-logind[1608]: Removed session 19. Dec 12 18:39:01.201616 sshd[5190]: Accepted publickey for core from 147.75.109.163 port 60900 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:01.202656 sshd-session[5190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:01.205639 systemd-logind[1608]: New session 20 of user core. Dec 12 18:39:01.210955 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 12 18:39:01.886751 sshd[5199]: Connection closed by 147.75.109.163 port 60900 Dec 12 18:39:01.887399 sshd-session[5190]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:01.894065 systemd[1]: sshd@17-139.178.70.101:22-147.75.109.163:60900.service: Deactivated successfully. Dec 12 18:39:01.896292 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 18:39:01.898914 systemd-logind[1608]: Session 20 logged out. Waiting for processes to exit. Dec 12 18:39:01.901566 systemd[1]: Started sshd@18-139.178.70.101:22-147.75.109.163:60902.service - OpenSSH per-connection server daemon (147.75.109.163:60902). Dec 12 18:39:01.903605 systemd-logind[1608]: Removed session 20. Dec 12 18:39:01.990954 sshd[5213]: Accepted publickey for core from 147.75.109.163 port 60902 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:01.992228 sshd-session[5213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:01.996206 systemd-logind[1608]: New session 21 of user core. Dec 12 18:39:02.001985 systemd[1]: Started session-21.scope - Session 21 of User core. 
Dec 12 18:39:02.299420 sshd[5219]: Connection closed by 147.75.109.163 port 60902 Dec 12 18:39:02.299821 sshd-session[5213]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:02.312153 systemd[1]: sshd@18-139.178.70.101:22-147.75.109.163:60902.service: Deactivated successfully. Dec 12 18:39:02.315679 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 18:39:02.316469 systemd-logind[1608]: Session 21 logged out. Waiting for processes to exit. Dec 12 18:39:02.320006 systemd[1]: Started sshd@19-139.178.70.101:22-147.75.109.163:43782.service - OpenSSH per-connection server daemon (147.75.109.163:43782). Dec 12 18:39:02.323079 systemd-logind[1608]: Removed session 21. Dec 12 18:39:02.388662 sshd[5237]: Accepted publickey for core from 147.75.109.163 port 43782 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:02.389548 sshd-session[5237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:02.392414 systemd-logind[1608]: New session 22 of user core. Dec 12 18:39:02.399900 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 12 18:39:02.492981 sshd[5240]: Connection closed by 147.75.109.163 port 43782 Dec 12 18:39:02.493357 sshd-session[5237]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:02.495615 systemd[1]: sshd@19-139.178.70.101:22-147.75.109.163:43782.service: Deactivated successfully. Dec 12 18:39:02.497172 systemd[1]: session-22.scope: Deactivated successfully. Dec 12 18:39:02.498155 systemd-logind[1608]: Session 22 logged out. Waiting for processes to exit. Dec 12 18:39:02.499404 systemd-logind[1608]: Removed session 22. Dec 12 18:39:07.098879 kubelet[2916]: E1212 18:39:07.098663 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-kv4lq" podUID="8031345b-ae64-4c92-b720-42f91263ef98" Dec 12 18:39:07.505168 systemd[1]: Started sshd@20-139.178.70.101:22-147.75.109.163:43790.service - OpenSSH per-connection server daemon (147.75.109.163:43790). Dec 12 18:39:07.554481 sshd[5253]: Accepted publickey for core from 147.75.109.163 port 43790 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:07.555482 sshd-session[5253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:07.558203 systemd-logind[1608]: New session 23 of user core. Dec 12 18:39:07.561934 systemd[1]: Started session-23.scope - Session 23 of User core. Dec 12 18:39:07.701164 sshd[5256]: Connection closed by 147.75.109.163 port 43790 Dec 12 18:39:07.704544 systemd-logind[1608]: Session 23 logged out. Waiting for processes to exit. 
Dec 12 18:39:07.702300 sshd-session[5253]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:07.704951 systemd[1]: sshd@20-139.178.70.101:22-147.75.109.163:43790.service: Deactivated successfully. Dec 12 18:39:07.706038 systemd[1]: session-23.scope: Deactivated successfully. Dec 12 18:39:07.706853 systemd-logind[1608]: Removed session 23. Dec 12 18:39:09.098015 kubelet[2916]: E1212 18:39:09.097982 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-pczf2" podUID="add8767b-e2e5-42f4-972c-f607ea5079b5" Dec 12 18:39:09.098286 kubelet[2916]: E1212 18:39:09.098109 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6989cc9fb7-8wtsx" podUID="8796fb6d-23f3-4add-b202-c640f2b3e6d5" Dec 12 18:39:12.099231 containerd[1626]: time="2025-12-12T18:39:12.098852601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 18:39:12.472821 containerd[1626]: time="2025-12-12T18:39:12.472628637Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:39:12.473019 containerd[1626]: time="2025-12-12T18:39:12.472997620Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 18:39:12.473075 containerd[1626]: time="2025-12-12T18:39:12.473046546Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 18:39:12.473627 kubelet[2916]: E1212 18:39:12.473145 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:39:12.473627 kubelet[2916]: E1212 18:39:12.473177 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 18:39:12.473627 kubelet[2916]: E1212 18:39:12.473273 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:492d27a1f7a946e6906773b2f3082bed,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 18:39:12.475313 containerd[1626]: time="2025-12-12T18:39:12.475287301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 18:39:12.717275 systemd[1]: Started sshd@21-139.178.70.101:22-147.75.109.163:56336.service - OpenSSH per-connection server daemon (147.75.109.163:56336). Dec 12 18:39:12.768262 sshd[5297]: Accepted publickey for core from 147.75.109.163 port 56336 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:12.771039 sshd-session[5297]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:12.779483 systemd-logind[1608]: New session 24 of user core. Dec 12 18:39:12.783933 systemd[1]: Started session-24.scope - Session 24 of User core. 
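The long "&Container{...}" blobs in the UnhandledError records are the kubelet printing its Go view of the pod spec, i.e. a k8s.io/api/core/v1.Container rendered with %+v-style formatting. A sketch reconstructing the whisker container from a representative subset of the fields logged above:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func ptr[T any](v T) *T { return &v }

    func main() {
        c := corev1.Container{
            Name:            "whisker",
            Image:           "ghcr.io/flatcar/calico/whisker:v3.30.4", // the tag that 404s
            Env:             []corev1.EnvVar{{Name: "LOG_LEVEL", Value: "INFO"}},
            ImagePullPolicy: corev1.PullIfNotPresent,
            SecurityContext: &corev1.SecurityContext{
                Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
                Privileged:               ptr(false),
                RunAsUser:                ptr(int64(10001)),
                RunAsGroup:               ptr(int64(10001)),
                RunAsNonRoot:             ptr(true),
                AllowPrivilegeEscalation: ptr(false),
                SeccompProfile:           &corev1.SeccompProfile{Type: corev1.SeccompProfileTypeRuntimeDefault},
            },
        }
        fmt.Printf("%+v\n", c) // prints in the same style as the log's UnhandledError dump
    }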
Dec 12 18:39:12.825397 containerd[1626]: time="2025-12-12T18:39:12.825369839Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:39:12.825914 containerd[1626]: time="2025-12-12T18:39:12.825892647Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 18:39:12.826025 containerd[1626]: time="2025-12-12T18:39:12.825966334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 18:39:12.826189 kubelet[2916]: E1212 18:39:12.826166 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:39:12.827820 kubelet[2916]: E1212 18:39:12.827804 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 18:39:12.828576 kubelet[2916]: E1212 18:39:12.827997 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f47hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod whisker-d84cf57d5-mhld9_calico-system(808b905f-e397-4649-b7f1-bb0b9a44aaa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 18:39:12.829333 kubelet[2916]: E1212 18:39:12.829309 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-d84cf57d5-mhld9" podUID="808b905f-e397-4649-b7f1-bb0b9a44aaa4" Dec 12 18:39:12.919020 sshd[5300]: Connection closed by 147.75.109.163 port 56336 Dec 12 18:39:12.919392 sshd-session[5297]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:12.921592 systemd[1]: sshd@21-139.178.70.101:22-147.75.109.163:56336.service: Deactivated successfully. Dec 12 18:39:12.923751 systemd[1]: session-24.scope: Deactivated successfully. Dec 12 18:39:12.925555 systemd-logind[1608]: Session 24 logged out. Waiting for processes to exit. Dec 12 18:39:12.926409 systemd-logind[1608]: Removed session 24. 
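When a pod has several failing containers, the "Error syncing pod, skipping" record carries a bracketed list with one error per container, as in the whisker entry above. A sketch of that shape, assuming the kubelet aggregates with apimachinery's utilerrors.NewAggregate (whose formatting matches the bracketed, comma-separated list in the log):

    package main

    import (
        "errors"
        "fmt"

        utilerrors "k8s.io/apimachinery/pkg/util/errors"
    )

    func main() {
        agg := utilerrors.NewAggregate([]error{
            errors.New(`failed to "StartContainer" for "whisker" with ErrImagePull`),
            errors.New(`failed to "StartContainer" for "whisker-backend" with ErrImagePull`),
        })
        fmt.Println("Error syncing pod, skipping:", agg) // prints [err1, err2]
    }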
Dec 12 18:39:13.099530 kubelet[2916]: E1212 18:39:13.098831 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-kj5np" podUID="29190c43-36b5-4106-a010-9c89c155f184" Dec 12 18:39:13.099969 containerd[1626]: time="2025-12-12T18:39:13.099829502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 18:39:13.531451 containerd[1626]: time="2025-12-12T18:39:13.531240058Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 18:39:13.531711 containerd[1626]: time="2025-12-12T18:39:13.531685898Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 18:39:13.531864 containerd[1626]: time="2025-12-12T18:39:13.531774687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 18:39:13.532056 kubelet[2916]: E1212 18:39:13.532007 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:39:13.532302 kubelet[2916]: E1212 18:39:13.532062 2916 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 18:39:13.532302 kubelet[2916]: E1212 18:39:13.532160 2916 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hlz27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-654cf49445-47vm8_calico-apiserver(7abc4147-fe9a-4147-b541-36ac776f31d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 18:39:13.534138 kubelet[2916]: E1212 18:39:13.534105 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-654cf49445-47vm8" podUID="7abc4147-fe9a-4147-b541-36ac776f31d2" Dec 12 18:39:17.929675 systemd[1]: Started sshd@22-139.178.70.101:22-147.75.109.163:56346.service - OpenSSH per-connection server daemon (147.75.109.163:56346). Dec 12 18:39:17.993797 sshd[5325]: Accepted publickey for core from 147.75.109.163 port 56346 ssh2: RSA SHA256:+6I0r0vywr0geDH5ZUg5YPBr1cYJhliHmsQkYIMTWME Dec 12 18:39:17.993941 sshd-session[5325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 18:39:17.998545 systemd-logind[1608]: New session 25 of user core. Dec 12 18:39:18.000900 systemd[1]: Started session-25.scope - Session 25 of User core. 
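Unlike the exec-based probes on calico-kube-controllers and goldmane, the calico-apiserver spec above carries an HTTP readiness probe: HTTPS GET /readyz on port 5443, checked every 60s and failing after 3 misses. The same probe expressed directly with the kubelet's types:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        p := corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Path:   "/readyz",
                    Port:   intstr.FromInt(5443),
                    Scheme: corev1.URISchemeHTTPS,
                },
            },
            TimeoutSeconds:   5, // values from the logged spec
            PeriodSeconds:    60,
            SuccessThreshold: 1,
            FailureThreshold: 3,
        }
        fmt.Printf("%+v\n", p) // renders like the Probe{...} fragment in the log
    }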
Dec 12 18:39:18.133543 sshd[5335]: Connection closed by 147.75.109.163 port 56346 Dec 12 18:39:18.133914 sshd-session[5325]: pam_unix(sshd:session): session closed for user core Dec 12 18:39:18.136284 systemd-logind[1608]: Session 25 logged out. Waiting for processes to exit. Dec 12 18:39:18.136479 systemd[1]: sshd@22-139.178.70.101:22-147.75.109.163:56346.service: Deactivated successfully. Dec 12 18:39:18.137566 systemd[1]: session-25.scope: Deactivated successfully. Dec 12 18:39:18.139724 systemd-logind[1608]: Removed session 25.
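None of the v3.30.4 pulls succeed anywhere in this log, so the direct fix is to point the manifests at a tag the registry actually serves. A remediation sketch reusing the anonymous-token flow from the earlier example to list the tags ghcr.io has for one of the failing repositories (the endpoint is the standard /v2/<name>/tags/list from the OCI distribution spec):

    package main

    import (
        "encoding/json"
        "fmt"
        "net/http"
    )

    func main() {
        repo := "flatcar/calico/kube-controllers"

        resp, err := http.Get("https://ghcr.io/token?scope=repository:" + repo + ":pull")
        if err != nil {
            panic(err)
        }
        var tok struct {
            Token string `json:"token"`
        }
        if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
            panic(err)
        }
        resp.Body.Close()

        req, _ := http.NewRequest("GET", "https://ghcr.io/v2/"+repo+"/tags/list", nil)
        req.Header.Set("Authorization", "Bearer "+tok.Token)
        res, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer res.Body.Close()

        var tags struct {
            Name string   `json:"name"`
            Tags []string `json:"tags"`
        }
        if err := json.NewDecoder(res.Body).Decode(&tags); err != nil {
            panic(err)
        }
        fmt.Println(tags.Name, tags.Tags) // pick an existing tag to pin instead of v3.30.4
    }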