Oct 30 00:20:16.714835 kernel: Linux version 6.12.54-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Oct 29 22:07:32 -00 2025 Oct 30 00:20:16.714856 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e5fe4ef982f4bbc75df9f63e805c4ec086c6d95878919f55fe8c638c4d2b3b13 Oct 30 00:20:16.714863 kernel: Disabled fast string operations Oct 30 00:20:16.714867 kernel: BIOS-provided physical RAM map: Oct 30 00:20:16.714871 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Oct 30 00:20:16.714875 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Oct 30 00:20:16.714880 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Oct 30 00:20:16.714885 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Oct 30 00:20:16.714890 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Oct 30 00:20:16.714894 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Oct 30 00:20:16.714898 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Oct 30 00:20:16.714902 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Oct 30 00:20:16.714906 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Oct 30 00:20:16.714916 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Oct 30 00:20:16.714924 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Oct 30 00:20:16.714929 kernel: NX (Execute Disable) protection: active Oct 30 00:20:16.714933 kernel: APIC: Static calls initialized Oct 30 00:20:16.714938 kernel: SMBIOS 2.7 present. Oct 30 00:20:16.714943 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Oct 30 00:20:16.714948 kernel: DMI: Memory slots populated: 1/128 Oct 30 00:20:16.714953 kernel: vmware: hypercall mode: 0x00 Oct 30 00:20:16.714958 kernel: Hypervisor detected: VMware Oct 30 00:20:16.714962 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Oct 30 00:20:16.714968 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Oct 30 00:20:16.714973 kernel: vmware: using clock offset of 3474882744 ns Oct 30 00:20:16.714978 kernel: tsc: Detected 3408.000 MHz processor Oct 30 00:20:16.714983 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Oct 30 00:20:16.714988 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Oct 30 00:20:16.714993 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Oct 30 00:20:16.714998 kernel: total RAM covered: 3072M Oct 30 00:20:16.715003 kernel: Found optimal setting for mtrr clean up Oct 30 00:20:16.715008 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Oct 30 00:20:16.715013 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Oct 30 00:20:16.715020 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Oct 30 00:20:16.715028 kernel: Using GB pages for direct mapping Oct 30 00:20:16.715033 kernel: ACPI: Early table checksum verification disabled Oct 30 00:20:16.715038 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Oct 30 00:20:16.715043 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Oct 30 00:20:16.715048 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Oct 30 00:20:16.715056 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Oct 30 00:20:16.715064 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Oct 30 00:20:16.715078 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Oct 30 00:20:16.715084 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Oct 30 00:20:16.715090 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Oct 30 00:20:16.715095 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Oct 30 00:20:16.715100 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Oct 30 00:20:16.715105 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Oct 30 00:20:16.715112 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Oct 30 00:20:16.715117 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Oct 30 00:20:16.715317 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Oct 30 00:20:16.715323 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Oct 30 00:20:16.715328 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Oct 30 00:20:16.715333 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Oct 30 00:20:16.715339 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Oct 30 00:20:16.715344 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Oct 30 00:20:16.715349 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Oct 30 00:20:16.715356 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Oct 30 00:20:16.715362 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Oct 30 00:20:16.715367 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Oct 30 00:20:16.715372 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Oct 30 00:20:16.715377 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Oct 30 00:20:16.715382 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] Oct 30 00:20:16.715387 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] Oct 30 00:20:16.715393 kernel: Zone ranges: Oct 30 00:20:16.715398 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Oct 30 00:20:16.715404 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Oct 30 00:20:16.715409 kernel: Normal empty Oct 30 00:20:16.715414 kernel: Device empty Oct 30 00:20:16.715420 kernel: Movable zone start for each node Oct 30 00:20:16.715425 kernel: Early memory node ranges Oct 30 00:20:16.715430 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Oct 30 00:20:16.715435 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Oct 30 00:20:16.715440 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Oct 30 00:20:16.715445 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Oct 30 00:20:16.715450 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Oct 30 00:20:16.715456 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Oct 30 00:20:16.715462 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Oct 30 00:20:16.715467 kernel: ACPI: PM-Timer IO Port: 0x1008 Oct 30 00:20:16.715472 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Oct 30 00:20:16.715477 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Oct 30 00:20:16.715482 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Oct 30 00:20:16.715487 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Oct 30 00:20:16.715492 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Oct 30 00:20:16.715497 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Oct 30 00:20:16.715503 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge 
lint[0x1]) Oct 30 00:20:16.715508 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Oct 30 00:20:16.715513 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Oct 30 00:20:16.715518 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Oct 30 00:20:16.715523 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Oct 30 00:20:16.715528 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Oct 30 00:20:16.715533 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Oct 30 00:20:16.715538 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Oct 30 00:20:16.715543 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Oct 30 00:20:16.715548 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Oct 30 00:20:16.715555 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Oct 30 00:20:16.715560 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Oct 30 00:20:16.715565 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Oct 30 00:20:16.715570 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Oct 30 00:20:16.715575 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Oct 30 00:20:16.715585 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Oct 30 00:20:16.715592 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Oct 30 00:20:16.715597 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Oct 30 00:20:16.715602 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Oct 30 00:20:16.715608 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Oct 30 00:20:16.715615 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Oct 30 00:20:16.715620 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Oct 30 00:20:16.715625 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Oct 30 00:20:16.715630 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Oct 30 00:20:16.715635 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Oct 30 00:20:16.715640 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Oct 30 00:20:16.715645 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Oct 30 00:20:16.715650 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Oct 30 00:20:16.715655 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Oct 30 00:20:16.715660 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Oct 30 00:20:16.715667 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Oct 30 00:20:16.715672 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Oct 30 00:20:16.715677 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Oct 30 00:20:16.715682 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Oct 30 00:20:16.715691 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Oct 30 00:20:16.715698 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Oct 30 00:20:16.715704 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Oct 30 00:20:16.715712 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Oct 30 00:20:16.715720 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Oct 30 00:20:16.715726 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Oct 30 00:20:16.715731 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Oct 30 00:20:16.715736 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Oct 30 00:20:16.715742 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Oct 30 00:20:16.715747 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x31] high edge lint[0x1]) Oct 30 00:20:16.715753 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Oct 30 00:20:16.715758 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Oct 30 00:20:16.715763 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Oct 30 00:20:16.715769 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Oct 30 00:20:16.715775 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Oct 30 00:20:16.715781 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Oct 30 00:20:16.715786 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Oct 30 00:20:16.715791 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Oct 30 00:20:16.715797 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Oct 30 00:20:16.715802 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Oct 30 00:20:16.715807 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Oct 30 00:20:16.715813 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Oct 30 00:20:16.715818 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Oct 30 00:20:16.715824 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Oct 30 00:20:16.715830 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Oct 30 00:20:16.715836 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Oct 30 00:20:16.715841 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Oct 30 00:20:16.715846 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Oct 30 00:20:16.715852 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Oct 30 00:20:16.715857 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Oct 30 00:20:16.715862 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Oct 30 00:20:16.715868 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Oct 30 00:20:16.715874 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Oct 30 00:20:16.715883 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Oct 30 00:20:16.715889 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Oct 30 00:20:16.715894 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Oct 30 00:20:16.715899 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Oct 30 00:20:16.715904 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Oct 30 00:20:16.715910 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Oct 30 00:20:16.715915 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Oct 30 00:20:16.715921 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Oct 30 00:20:16.715926 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Oct 30 00:20:16.715932 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Oct 30 00:20:16.715941 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Oct 30 00:20:16.715950 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Oct 30 00:20:16.715955 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Oct 30 00:20:16.716922 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Oct 30 00:20:16.716930 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Oct 30 00:20:16.716938 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Oct 30 00:20:16.716946 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Oct 30 00:20:16.716952 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Oct 30 00:20:16.716957 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Oct 30 00:20:16.716963 kernel: 
ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Oct 30 00:20:16.716970 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Oct 30 00:20:16.716976 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Oct 30 00:20:16.716981 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Oct 30 00:20:16.716987 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Oct 30 00:20:16.716992 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Oct 30 00:20:16.716998 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Oct 30 00:20:16.717003 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Oct 30 00:20:16.717008 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Oct 30 00:20:16.717014 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Oct 30 00:20:16.717019 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Oct 30 00:20:16.717026 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Oct 30 00:20:16.717031 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Oct 30 00:20:16.717037 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Oct 30 00:20:16.717042 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Oct 30 00:20:16.717047 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Oct 30 00:20:16.717053 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Oct 30 00:20:16.717058 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Oct 30 00:20:16.717064 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Oct 30 00:20:16.717069 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Oct 30 00:20:16.717075 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Oct 30 00:20:16.717081 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Oct 30 00:20:16.717087 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Oct 30 00:20:16.717092 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Oct 30 00:20:16.717097 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Oct 30 00:20:16.717103 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Oct 30 00:20:16.717108 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Oct 30 00:20:16.717114 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Oct 30 00:20:16.717119 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Oct 30 00:20:16.717139 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Oct 30 00:20:16.717146 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Oct 30 00:20:16.717151 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Oct 30 00:20:16.717157 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Oct 30 00:20:16.717162 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Oct 30 00:20:16.717168 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Oct 30 00:20:16.717173 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Oct 30 00:20:16.717179 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Oct 30 00:20:16.717184 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Oct 30 00:20:16.717190 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Oct 30 00:20:16.717196 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Oct 30 00:20:16.717202 kernel: TSC deadline timer available Oct 30 00:20:16.717208 kernel: CPU topo: Max. logical packages: 128 Oct 30 00:20:16.717213 kernel: CPU topo: Max. logical dies: 128 Oct 30 00:20:16.717219 kernel: CPU topo: Max. 
dies per package: 1 Oct 30 00:20:16.717224 kernel: CPU topo: Max. threads per core: 1 Oct 30 00:20:16.717229 kernel: CPU topo: Num. cores per package: 1 Oct 30 00:20:16.717235 kernel: CPU topo: Num. threads per package: 1 Oct 30 00:20:16.717240 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs Oct 30 00:20:16.717246 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Oct 30 00:20:16.717252 kernel: Booting paravirtualized kernel on VMware hypervisor Oct 30 00:20:16.717258 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Oct 30 00:20:16.717264 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Oct 30 00:20:16.717269 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Oct 30 00:20:16.717275 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Oct 30 00:20:16.717280 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Oct 30 00:20:16.717286 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Oct 30 00:20:16.717291 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Oct 30 00:20:16.717297 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Oct 30 00:20:16.717303 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Oct 30 00:20:16.717308 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Oct 30 00:20:16.717314 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Oct 30 00:20:16.717319 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Oct 30 00:20:16.717324 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Oct 30 00:20:16.717330 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Oct 30 00:20:16.717335 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Oct 30 00:20:16.717341 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Oct 30 00:20:16.717346 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Oct 30 00:20:16.717352 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Oct 30 00:20:16.717358 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Oct 30 00:20:16.717363 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Oct 30 00:20:16.717369 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e5fe4ef982f4bbc75df9f63e805c4ec086c6d95878919f55fe8c638c4d2b3b13 Oct 30 00:20:16.717375 kernel: random: crng init done Oct 30 00:20:16.717381 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Oct 30 00:20:16.717386 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Oct 30 00:20:16.717392 kernel: printk: log_buf_len min size: 262144 bytes Oct 30 00:20:16.717398 kernel: printk: log_buf_len: 1048576 bytes Oct 30 00:20:16.717404 kernel: printk: early log buf free: 245704(93%) Oct 30 00:20:16.717409 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 30 00:20:16.717415 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Oct 30 00:20:16.717421 kernel: Fallback order for Node 0: 0 Oct 30 00:20:16.717426 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 Oct 30 00:20:16.717432 kernel: Policy zone: DMA32 Oct 30 00:20:16.717437 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Oct 30 00:20:16.717443 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Oct 30 00:20:16.717449 kernel: ftrace: allocating 40021 entries in 157 pages Oct 30 00:20:16.717455 kernel: ftrace: allocated 157 pages with 5 groups Oct 30 00:20:16.717460 kernel: Dynamic Preempt: voluntary Oct 30 00:20:16.717466 kernel: rcu: Preemptible hierarchical RCU implementation. Oct 30 00:20:16.717472 kernel: rcu: RCU event tracing is enabled. Oct 30 00:20:16.717478 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Oct 30 00:20:16.717483 kernel: Trampoline variant of Tasks RCU enabled. Oct 30 00:20:16.717489 kernel: Rude variant of Tasks RCU enabled. Oct 30 00:20:16.717494 kernel: Tracing variant of Tasks RCU enabled. Oct 30 00:20:16.717500 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Oct 30 00:20:16.717506 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Oct 30 00:20:16.717512 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 30 00:20:16.717517 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 30 00:20:16.717523 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Oct 30 00:20:16.717529 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Oct 30 00:20:16.717534 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Oct 30 00:20:16.717540 kernel: Console: colour VGA+ 80x25 Oct 30 00:20:16.717545 kernel: printk: legacy console [tty0] enabled Oct 30 00:20:16.717552 kernel: printk: legacy console [ttyS0] enabled Oct 30 00:20:16.717557 kernel: ACPI: Core revision 20240827 Oct 30 00:20:16.717563 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Oct 30 00:20:16.717569 kernel: APIC: Switch to symmetric I/O mode setup Oct 30 00:20:16.717574 kernel: x2apic enabled Oct 30 00:20:16.717580 kernel: APIC: Switched APIC routing to: physical x2apic Oct 30 00:20:16.717585 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Oct 30 00:20:16.717591 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 30 00:20:16.717597 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Oct 30 00:20:16.717603 kernel: Disabled fast string operations Oct 30 00:20:16.717609 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Oct 30 00:20:16.717614 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Oct 30 00:20:16.717620 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Oct 30 00:20:16.717625 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Oct 30 00:20:16.717631 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Oct 30 00:20:16.717636 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Oct 30 00:20:16.717642 kernel: RETBleed: Mitigation: Enhanced IBRS Oct 30 00:20:16.717648 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Oct 30 00:20:16.717654 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Oct 30 00:20:16.717660 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Oct 30 00:20:16.717665 kernel: SRBDS: Unknown: Dependent on hypervisor status Oct 30 00:20:16.717671 kernel: GDS: Unknown: Dependent on hypervisor status Oct 30 00:20:16.717676 kernel: active return thunk: its_return_thunk Oct 30 00:20:16.717682 kernel: ITS: Mitigation: Aligned branch/return thunks Oct 30 00:20:16.717687 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Oct 30 00:20:16.717693 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Oct 30 00:20:16.717698 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Oct 30 00:20:16.717705 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Oct 30 00:20:16.717710 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Oct 30 00:20:16.717716 kernel: Freeing SMP alternatives memory: 32K Oct 30 00:20:16.717721 kernel: pid_max: default: 131072 minimum: 1024 Oct 30 00:20:16.717727 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Oct 30 00:20:16.717733 kernel: landlock: Up and running. Oct 30 00:20:16.717738 kernel: SELinux: Initializing. Oct 30 00:20:16.717743 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Oct 30 00:20:16.717749 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Oct 30 00:20:16.717756 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Oct 30 00:20:16.717761 kernel: Performance Events: Skylake events, core PMU driver. Oct 30 00:20:16.717767 kernel: core: CPUID marked event: 'cpu cycles' unavailable Oct 30 00:20:16.717772 kernel: core: CPUID marked event: 'instructions' unavailable Oct 30 00:20:16.717778 kernel: core: CPUID marked event: 'bus cycles' unavailable Oct 30 00:20:16.717783 kernel: core: CPUID marked event: 'cache references' unavailable Oct 30 00:20:16.717789 kernel: core: CPUID marked event: 'cache misses' unavailable Oct 30 00:20:16.717794 kernel: core: CPUID marked event: 'branch instructions' unavailable Oct 30 00:20:16.717799 kernel: core: CPUID marked event: 'branch misses' unavailable Oct 30 00:20:16.717806 kernel: ... version: 1 Oct 30 00:20:16.717811 kernel: ... bit width: 48 Oct 30 00:20:16.717817 kernel: ... generic registers: 4 Oct 30 00:20:16.717822 kernel: ... value mask: 0000ffffffffffff Oct 30 00:20:16.717828 kernel: ... max period: 000000007fffffff Oct 30 00:20:16.717835 kernel: ... 
fixed-purpose events: 0 Oct 30 00:20:16.717842 kernel: ... event mask: 000000000000000f Oct 30 00:20:16.717848 kernel: signal: max sigframe size: 1776 Oct 30 00:20:16.717853 kernel: rcu: Hierarchical SRCU implementation. Oct 30 00:20:16.717860 kernel: rcu: Max phase no-delay instances is 400. Oct 30 00:20:16.717866 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level Oct 30 00:20:16.717871 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Oct 30 00:20:16.717877 kernel: smp: Bringing up secondary CPUs ... Oct 30 00:20:16.717882 kernel: smpboot: x86: Booting SMP configuration: Oct 30 00:20:16.717888 kernel: .... node #0, CPUs: #1 Oct 30 00:20:16.717893 kernel: Disabled fast string operations Oct 30 00:20:16.717899 kernel: smp: Brought up 1 node, 2 CPUs Oct 30 00:20:16.717904 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Oct 30 00:20:16.717911 kernel: Memory: 1918116K/2096628K available (14336K kernel code, 2436K rwdata, 26048K rodata, 45544K init, 1184K bss, 167128K reserved, 0K cma-reserved) Oct 30 00:20:16.717917 kernel: devtmpfs: initialized Oct 30 00:20:16.717923 kernel: x86/mm: Memory block size: 128MB Oct 30 00:20:16.717928 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Oct 30 00:20:16.717934 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Oct 30 00:20:16.717939 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Oct 30 00:20:16.717945 kernel: pinctrl core: initialized pinctrl subsystem Oct 30 00:20:16.717951 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Oct 30 00:20:16.717956 kernel: audit: initializing netlink subsys (disabled) Oct 30 00:20:16.717963 kernel: audit: type=2000 audit(1761783613.282:1): state=initialized audit_enabled=0 res=1 Oct 30 00:20:16.717968 kernel: thermal_sys: Registered thermal governor 'step_wise' Oct 30 00:20:16.717974 kernel: thermal_sys: Registered thermal governor 'user_space' Oct 30 00:20:16.717979 kernel: cpuidle: using governor menu Oct 30 00:20:16.717984 kernel: Simple Boot Flag at 0x36 set to 0x80 Oct 30 00:20:16.717990 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Oct 30 00:20:16.717995 kernel: dca service started, version 1.12.1 Oct 30 00:20:16.718001 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] Oct 30 00:20:16.718013 kernel: PCI: Using configuration type 1 for base access Oct 30 00:20:16.718021 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Oct 30 00:20:16.718027 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Oct 30 00:20:16.718033 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Oct 30 00:20:16.718038 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Oct 30 00:20:16.718044 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Oct 30 00:20:16.718050 kernel: ACPI: Added _OSI(Module Device) Oct 30 00:20:16.718056 kernel: ACPI: Added _OSI(Processor Device) Oct 30 00:20:16.718061 kernel: ACPI: Added _OSI(Processor Aggregator Device) Oct 30 00:20:16.718067 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Oct 30 00:20:16.718075 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Oct 30 00:20:16.718082 kernel: ACPI: Interpreter enabled Oct 30 00:20:16.718088 kernel: ACPI: PM: (supports S0 S1 S5) Oct 30 00:20:16.718094 kernel: ACPI: Using IOAPIC for interrupt routing Oct 30 00:20:16.718100 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Oct 30 00:20:16.718105 kernel: PCI: Using E820 reservations for host bridge windows Oct 30 00:20:16.718111 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Oct 30 00:20:16.718117 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Oct 30 00:20:16.718222 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Oct 30 00:20:16.718292 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Oct 30 00:20:16.718344 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Oct 30 00:20:16.718352 kernel: PCI host bridge to bus 0000:00 Oct 30 00:20:16.718403 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Oct 30 00:20:16.718452 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Oct 30 00:20:16.718501 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Oct 30 00:20:16.718547 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Oct 30 00:20:16.718594 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Oct 30 00:20:16.718637 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Oct 30 00:20:16.718697 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint Oct 30 00:20:16.718757 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge Oct 30 00:20:16.718808 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 30 00:20:16.718865 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint Oct 30 00:20:16.718920 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint Oct 30 00:20:16.718976 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] Oct 30 00:20:16.719027 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Oct 30 00:20:16.719079 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk Oct 30 00:20:16.719147 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Oct 30 00:20:16.719200 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk Oct 30 00:20:16.719261 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Oct 30 00:20:16.719311 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Oct 30 00:20:16.719360 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Oct 30 00:20:16.719413 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 
0x088000 conventional PCI endpoint Oct 30 00:20:16.719467 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] Oct 30 00:20:16.719516 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] Oct 30 00:20:16.719570 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint Oct 30 00:20:16.719620 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f] Oct 30 00:20:16.719670 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] Oct 30 00:20:16.719719 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] Oct 30 00:20:16.719769 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] Oct 30 00:20:16.719818 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Oct 30 00:20:16.719872 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge Oct 30 00:20:16.719921 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Oct 30 00:20:16.719970 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Oct 30 00:20:16.720018 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Oct 30 00:20:16.720067 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 30 00:20:16.721011 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.721080 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 30 00:20:16.721144 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Oct 30 00:20:16.721207 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Oct 30 00:20:16.721271 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.721337 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.721391 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 30 00:20:16.721445 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Oct 30 00:20:16.721506 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Oct 30 00:20:16.721560 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Oct 30 00:20:16.721615 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.721676 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.721728 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 30 00:20:16.721781 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Oct 30 00:20:16.721832 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Oct 30 00:20:16.721883 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 30 00:20:16.721933 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.721987 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.722038 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 30 00:20:16.722087 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 30 00:20:16.722157 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 30 00:20:16.722209 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.722266 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.722323 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 30 00:20:16.722373 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 30 00:20:16.722432 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 30 00:20:16.722489 
kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.722544 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.724150 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 30 00:20:16.724211 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 30 00:20:16.724266 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 30 00:20:16.724322 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.724381 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.724433 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 30 00:20:16.724484 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 30 00:20:16.724538 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 30 00:20:16.724595 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.724660 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.724717 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 30 00:20:16.724769 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 30 00:20:16.724821 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 30 00:20:16.724881 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.724938 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.724989 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 30 00:20:16.725039 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 30 00:20:16.725092 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 30 00:20:16.725175 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.725233 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.725284 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 30 00:20:16.725337 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 30 00:20:16.725388 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 30 00:20:16.725443 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 30 00:20:16.725493 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.725547 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.725603 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 30 00:20:16.725653 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 30 00:20:16.725705 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 30 00:20:16.725755 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 30 00:20:16.725810 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.725866 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.725931 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 30 00:20:16.725981 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 30 00:20:16.726035 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 30 00:20:16.726088 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.726176 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.726229 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 30 00:20:16.726279 kernel: pci 
0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 30 00:20:16.726328 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 30 00:20:16.726378 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.726432 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.726483 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 30 00:20:16.726540 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 30 00:20:16.726591 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 30 00:20:16.726640 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.726694 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.726749 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 30 00:20:16.726807 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 30 00:20:16.726868 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 30 00:20:16.726920 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.726976 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.727026 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 30 00:20:16.727076 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 30 00:20:16.727143 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 30 00:20:16.727199 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.727258 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.727313 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 30 00:20:16.727363 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 30 00:20:16.727413 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 30 00:20:16.727464 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 30 00:20:16.727526 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.727588 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.727641 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 30 00:20:16.727698 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 30 00:20:16.727756 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 30 00:20:16.727807 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 30 00:20:16.727856 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.727914 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.727970 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 30 00:20:16.728020 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 30 00:20:16.728070 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 30 00:20:16.728118 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 30 00:20:16.728184 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.728244 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.728301 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 30 00:20:16.728352 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 30 00:20:16.728402 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 30 
00:20:16.728452 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.728505 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.728555 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 30 00:20:16.728609 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 30 00:20:16.728661 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 30 00:20:16.728713 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.728769 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.728820 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 30 00:20:16.728883 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 30 00:20:16.728933 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 30 00:20:16.728995 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.729054 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.729110 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 30 00:20:16.729181 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 30 00:20:16.729233 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Oct 30 00:20:16.729282 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.729338 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.729388 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 30 00:20:16.729441 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 30 00:20:16.729495 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 30 00:20:16.729547 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.729603 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.729657 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 30 00:20:16.729725 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 30 00:20:16.729785 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 30 00:20:16.729844 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 30 00:20:16.729903 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.729958 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.730009 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 30 00:20:16.730058 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 30 00:20:16.730107 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Oct 30 00:20:16.730188 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 30 00:20:16.730245 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.730304 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.730355 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 30 00:20:16.730407 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 30 00:20:16.730461 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 30 00:20:16.730516 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.730582 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.730649 kernel: pci 0000:00:18.3: PCI bridge to [bus 
1e] Oct 30 00:20:16.730708 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 30 00:20:16.730759 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 30 00:20:16.730809 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.730870 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.730921 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 30 00:20:16.730971 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 30 00:20:16.731020 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 30 00:20:16.731072 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.731142 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.731195 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 30 00:20:16.731245 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 30 00:20:16.731294 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 30 00:20:16.731347 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.731411 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.731475 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 30 00:20:16.731530 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 30 00:20:16.731584 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 30 00:20:16.731639 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.731694 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port Oct 30 00:20:16.731744 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 30 00:20:16.731800 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 30 00:20:16.731852 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 30 00:20:16.731912 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.731970 kernel: pci_bus 0000:01: extended config space not accessible Oct 30 00:20:16.732023 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 30 00:20:16.732080 kernel: pci_bus 0000:02: extended config space not accessible Oct 30 00:20:16.732090 kernel: acpiphp: Slot [32] registered Oct 30 00:20:16.732096 kernel: acpiphp: Slot [33] registered Oct 30 00:20:16.732102 kernel: acpiphp: Slot [34] registered Oct 30 00:20:16.732110 kernel: acpiphp: Slot [35] registered Oct 30 00:20:16.732117 kernel: acpiphp: Slot [36] registered Oct 30 00:20:16.732131 kernel: acpiphp: Slot [37] registered Oct 30 00:20:16.732412 kernel: acpiphp: Slot [38] registered Oct 30 00:20:16.732422 kernel: acpiphp: Slot [39] registered Oct 30 00:20:16.732431 kernel: acpiphp: Slot [40] registered Oct 30 00:20:16.732438 kernel: acpiphp: Slot [41] registered Oct 30 00:20:16.732444 kernel: acpiphp: Slot [42] registered Oct 30 00:20:16.732450 kernel: acpiphp: Slot [43] registered Oct 30 00:20:16.732457 kernel: acpiphp: Slot [44] registered Oct 30 00:20:16.732463 kernel: acpiphp: Slot [45] registered Oct 30 00:20:16.732469 kernel: acpiphp: Slot [46] registered Oct 30 00:20:16.732475 kernel: acpiphp: Slot [47] registered Oct 30 00:20:16.732481 kernel: acpiphp: Slot [48] registered Oct 30 00:20:16.732487 kernel: acpiphp: Slot [49] registered Oct 30 00:20:16.732493 kernel: acpiphp: Slot [50] registered Oct 30 00:20:16.732498 kernel: acpiphp: Slot [51] registered Oct 30 
00:20:16.732504 kernel: acpiphp: Slot [52] registered Oct 30 00:20:16.732510 kernel: acpiphp: Slot [53] registered Oct 30 00:20:16.732517 kernel: acpiphp: Slot [54] registered Oct 30 00:20:16.732523 kernel: acpiphp: Slot [55] registered Oct 30 00:20:16.732529 kernel: acpiphp: Slot [56] registered Oct 30 00:20:16.732535 kernel: acpiphp: Slot [57] registered Oct 30 00:20:16.732541 kernel: acpiphp: Slot [58] registered Oct 30 00:20:16.732546 kernel: acpiphp: Slot [59] registered Oct 30 00:20:16.732552 kernel: acpiphp: Slot [60] registered Oct 30 00:20:16.732558 kernel: acpiphp: Slot [61] registered Oct 30 00:20:16.732564 kernel: acpiphp: Slot [62] registered Oct 30 00:20:16.732571 kernel: acpiphp: Slot [63] registered Oct 30 00:20:16.732628 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Oct 30 00:20:16.732685 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Oct 30 00:20:16.732737 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Oct 30 00:20:16.732788 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Oct 30 00:20:16.732838 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Oct 30 00:20:16.732892 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Oct 30 00:20:16.732963 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint Oct 30 00:20:16.733024 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] Oct 30 00:20:16.733088 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] Oct 30 00:20:16.733160 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 30 00:20:16.733215 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Oct 30 00:20:16.733271 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Oct 30 00:20:16.733323 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 30 00:20:16.733376 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 30 00:20:16.733432 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 30 00:20:16.733483 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 30 00:20:16.733535 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 30 00:20:16.733586 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 30 00:20:16.733641 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 30 00:20:16.733692 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 30 00:20:16.733762 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint Oct 30 00:20:16.733819 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] Oct 30 00:20:16.733874 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] Oct 30 00:20:16.733932 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] Oct 30 00:20:16.733990 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] Oct 30 00:20:16.734042 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] Oct 30 00:20:16.734106 kernel: pci 0000:0b:00.0: supports D1 D2 Oct 30 00:20:16.734175 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 30 00:20:16.734237 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Oct 30 00:20:16.734289 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 30 00:20:16.734341 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 30 00:20:16.734396 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 30 00:20:16.734452 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 30 00:20:16.734508 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 30 00:20:16.734565 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 30 00:20:16.734625 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 30 00:20:16.734679 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 30 00:20:16.734730 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 30 00:20:16.734781 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 30 00:20:16.734830 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 30 00:20:16.734887 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 30 00:20:16.734943 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 30 00:20:16.735010 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 30 00:20:16.735077 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 30 00:20:16.735209 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 30 00:20:16.735279 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 30 00:20:16.735346 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 30 00:20:16.735416 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 30 00:20:16.735477 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 30 00:20:16.735553 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 30 00:20:16.735622 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 30 00:20:16.735687 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 30 00:20:16.735740 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 30 00:20:16.735749 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Oct 30 00:20:16.735755 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Oct 30 00:20:16.735761 kernel: ACPI: PCI: Interrupt link LNKB disabled Oct 30 00:20:16.735770 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Oct 30 00:20:16.735777 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Oct 30 00:20:16.735782 kernel: iommu: Default domain type: Translated Oct 30 00:20:16.735790 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Oct 30 00:20:16.735796 kernel: PCI: Using ACPI for IRQ routing Oct 30 00:20:16.735802 kernel: PCI: pci_cache_line_size set to 64 bytes Oct 30 00:20:16.735808 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Oct 30 00:20:16.735814 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Oct 30 00:20:16.735878 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Oct 30 00:20:16.735932 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Oct 30 00:20:16.735990 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Oct 30 00:20:16.736000 kernel: vgaarb: loaded Oct 30 00:20:16.736008 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Oct 30 00:20:16.736014 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Oct 30 00:20:16.736020 kernel: clocksource: Switched to clocksource tsc-early Oct 30 00:20:16.736026 kernel: VFS: Disk quotas dquot_6.6.0 Oct 30 00:20:16.736033 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Oct 30 00:20:16.736040 kernel: pnp: PnP ACPI init Oct 30 00:20:16.736102 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Oct 30 00:20:16.736167 kernel: system 
00:00: [io 0x1040-0x104f] has been reserved Oct 30 00:20:16.736216 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Oct 30 00:20:16.736265 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Oct 30 00:20:16.736314 kernel: pnp 00:06: [dma 2] Oct 30 00:20:16.736366 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Oct 30 00:20:16.736412 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Oct 30 00:20:16.736457 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Oct 30 00:20:16.736474 kernel: pnp: PnP ACPI: found 8 devices Oct 30 00:20:16.736481 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Oct 30 00:20:16.736487 kernel: NET: Registered PF_INET protocol family Oct 30 00:20:16.736494 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Oct 30 00:20:16.736501 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Oct 30 00:20:16.736511 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Oct 30 00:20:16.736517 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Oct 30 00:20:16.736523 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 30 00:20:16.736529 kernel: TCP: Hash tables configured (established 16384 bind 16384) Oct 30 00:20:16.736537 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 30 00:20:16.736543 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Oct 30 00:20:16.736549 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Oct 30 00:20:16.736555 kernel: NET: Registered PF_XDP protocol family Oct 30 00:20:16.736611 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Oct 30 00:20:16.736666 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Oct 30 00:20:16.736724 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Oct 30 00:20:16.736778 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Oct 30 00:20:16.736838 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Oct 30 00:20:16.736895 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Oct 30 00:20:16.736945 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Oct 30 00:20:16.737004 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Oct 30 00:20:16.737054 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Oct 30 00:20:16.737109 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Oct 30 00:20:16.739001 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Oct 30 00:20:16.739060 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Oct 30 00:20:16.739136 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Oct 30 00:20:16.739195 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Oct 30 00:20:16.739254 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Oct 30 00:20:16.739307 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Oct 30 
00:20:16.739357 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Oct 30 00:20:16.739409 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Oct 30 00:20:16.739465 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Oct 30 00:20:16.739515 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Oct 30 00:20:16.739579 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Oct 30 00:20:16.739631 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Oct 30 00:20:16.739682 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Oct 30 00:20:16.739732 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned Oct 30 00:20:16.739787 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned Oct 30 00:20:16.739839 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.739892 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.739954 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740008 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740064 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740115 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740196 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740247 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740298 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740347 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740401 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740457 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740508 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740563 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740623 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740690 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740742 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740796 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740851 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.740906 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.740956 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.741005 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.741056 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.741105 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.742766 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.742825 kernel: pci 0000:00:17.5: 
bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.742903 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.742956 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743020 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743072 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743133 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743195 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743246 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743296 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743354 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743417 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743481 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743539 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743590 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743642 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743699 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743749 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743808 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743858 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.743908 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.743958 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744007 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744064 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744130 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744188 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744238 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744295 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744355 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744408 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744459 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744517 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744569 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744624 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744675 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744730 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744785 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space Oct 30 00:20:16.744840 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.744901 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.744957 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.745010 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.745060 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.745120 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.745704 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.745761 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.745813 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.745882 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.745939 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.745993 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.746043 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.746097 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.747185 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.747255 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.747317 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.747377 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.747430 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.747482 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.747534 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.747587 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space Oct 30 00:20:16.747649 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign Oct 30 00:20:16.747723 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 30 00:20:16.747779 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Oct 30 00:20:16.747835 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Oct 30 00:20:16.747894 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Oct 30 00:20:16.747946 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 30 00:20:16.748013 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned Oct 30 00:20:16.748066 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Oct 30 00:20:16.748117 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Oct 30 00:20:16.748176 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Oct 30 00:20:16.748226 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Oct 30 00:20:16.748301 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Oct 30 00:20:16.748363 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Oct 30 00:20:16.748422 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Oct 30 00:20:16.748473 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] Oct 30 00:20:16.748529 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Oct 30 00:20:16.748590 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Oct 30 00:20:16.748645 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Oct 30 00:20:16.748695 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Oct 30 00:20:16.748746 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Oct 30 00:20:16.748803 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Oct 30 00:20:16.748855 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Oct 30 00:20:16.748905 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Oct 30 00:20:16.748968 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Oct 30 00:20:16.749019 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 30 00:20:16.749069 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Oct 30 00:20:16.749120 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Oct 30 00:20:16.750532 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Oct 30 00:20:16.750589 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Oct 30 00:20:16.750640 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Oct 30 00:20:16.750690 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Oct 30 00:20:16.750747 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Oct 30 00:20:16.750799 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Oct 30 00:20:16.750849 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Oct 30 00:20:16.750906 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned Oct 30 00:20:16.750958 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Oct 30 00:20:16.751008 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Oct 30 00:20:16.751063 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Oct 30 00:20:16.751113 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Oct 30 00:20:16.751417 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Oct 30 00:20:16.751472 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Oct 30 00:20:16.751523 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Oct 30 00:20:16.751574 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Oct 30 00:20:16.751633 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Oct 30 00:20:16.751688 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Oct 30 00:20:16.751738 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Oct 30 00:20:16.751788 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Oct 30 00:20:16.751847 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Oct 30 00:20:16.751906 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Oct 30 00:20:16.751966 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 30 00:20:16.752019 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Oct 30 00:20:16.752068 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Oct 30 00:20:16.752151 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 30 00:20:16.752213 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Oct 30 00:20:16.752274 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Oct 30 00:20:16.752326 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Oct 30 00:20:16.752378 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Oct 30 00:20:16.752432 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Oct 30 00:20:16.752483 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Oct 30 00:20:16.752556 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Oct 30 00:20:16.753649 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Oct 30 00:20:16.753711 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 30 00:20:16.753766 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Oct 30 00:20:16.753817 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Oct 30 00:20:16.753867 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Oct 30 00:20:16.753925 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 30 00:20:16.753977 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Oct 30 00:20:16.754027 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Oct 30 00:20:16.754080 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Oct 30 00:20:16.754144 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Oct 30 00:20:16.754217 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Oct 30 00:20:16.754268 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Oct 30 00:20:16.754318 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Oct 30 00:20:16.754367 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Oct 30 00:20:16.754422 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Oct 30 00:20:16.754479 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Oct 30 00:20:16.754535 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 30 00:20:16.754599 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Oct 30 00:20:16.754650 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Oct 30 00:20:16.754701 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 30 00:20:16.754753 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Oct 30 00:20:16.755185 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Oct 30 00:20:16.755250 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Oct 30 00:20:16.755311 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Oct 30 00:20:16.755375 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Oct 30 00:20:16.755432 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Oct 30 00:20:16.755484 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Oct 30 00:20:16.755534 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Oct 30 00:20:16.755589 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 30 00:20:16.755642 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Oct 30 00:20:16.755693 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Oct 30 00:20:16.755746 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Oct 30 00:20:16.755806 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Oct 30 00:20:16.755859 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Oct 30 00:20:16.755910 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Oct 30 00:20:16.755960 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
Oct 30 00:20:16.756010 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Oct 30 00:20:16.756061 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Oct 30 00:20:16.756111 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Oct 30 00:20:16.757188 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Oct 30 00:20:16.757259 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Oct 30 00:20:16.757314 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Oct 30 00:20:16.757365 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 30 00:20:16.757420 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Oct 30 00:20:16.757470 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Oct 30 00:20:16.757521 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Oct 30 00:20:16.757572 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Oct 30 00:20:16.757623 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Oct 30 00:20:16.757681 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Oct 30 00:20:16.757741 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Oct 30 00:20:16.757797 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Oct 30 00:20:16.757848 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Oct 30 00:20:16.757899 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Oct 30 00:20:16.757949 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Oct 30 00:20:16.758002 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 30 00:20:16.758056 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Oct 30 00:20:16.758101 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 30 00:20:16.759143 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 30 00:20:16.759201 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Oct 30 00:20:16.759248 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Oct 30 00:20:16.759302 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Oct 30 00:20:16.759350 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Oct 30 00:20:16.759399 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Oct 30 00:20:16.759445 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Oct 30 00:20:16.759491 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Oct 30 00:20:16.759536 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Oct 30 00:20:16.759581 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Oct 30 00:20:16.759625 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Oct 30 00:20:16.759677 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Oct 30 00:20:16.759725 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Oct 30 00:20:16.759787 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Oct 30 00:20:16.759840 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Oct 30 00:20:16.759886 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Oct 30 00:20:16.759931 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Oct 30 00:20:16.759996 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Oct 30 00:20:16.760043 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Oct 
30 00:20:16.760090 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Oct 30 00:20:16.760162 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Oct 30 00:20:16.760209 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Oct 30 00:20:16.760259 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Oct 30 00:20:16.760306 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Oct 30 00:20:16.760355 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Oct 30 00:20:16.760400 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Oct 30 00:20:16.760455 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Oct 30 00:20:16.760501 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Oct 30 00:20:16.760569 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Oct 30 00:20:16.760619 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Oct 30 00:20:16.760672 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Oct 30 00:20:16.760723 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Oct 30 00:20:16.760767 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Oct 30 00:20:16.760817 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Oct 30 00:20:16.760862 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Oct 30 00:20:16.760907 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Oct 30 00:20:16.760957 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Oct 30 00:20:16.761004 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Oct 30 00:20:16.761051 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Oct 30 00:20:16.761102 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Oct 30 00:20:16.763181 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Oct 30 00:20:16.763252 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Oct 30 00:20:16.763304 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Oct 30 00:20:16.763358 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Oct 30 00:20:16.763409 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Oct 30 00:20:16.763462 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Oct 30 00:20:16.763508 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Oct 30 00:20:16.763558 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Oct 30 00:20:16.763604 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Oct 30 00:20:16.763653 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Oct 30 00:20:16.763701 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Oct 30 00:20:16.763747 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Oct 30 00:20:16.763797 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Oct 30 00:20:16.763843 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Oct 30 00:20:16.763888 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Oct 30 00:20:16.763937 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Oct 30 00:20:16.763984 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Oct 30 00:20:16.764031 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Oct 30 00:20:16.764081 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Oct 30 00:20:16.764796 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Oct 30 00:20:16.764873 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Oct 30 00:20:16.764931 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Oct 30 00:20:16.764984 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Oct 30 00:20:16.765034 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Oct 30 00:20:16.765084 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Oct 30 00:20:16.765165 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Oct 30 00:20:16.765220 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Oct 30 00:20:16.765267 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Oct 30 00:20:16.765572 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Oct 30 00:20:16.765634 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Oct 30 00:20:16.765692 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Oct 30 00:20:16.765756 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Oct 30 00:20:16.765804 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Oct 30 00:20:16.765849 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Oct 30 00:20:16.765899 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Oct 30 00:20:16.765945 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Oct 30 00:20:16.765999 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Oct 30 00:20:16.766046 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Oct 30 00:20:16.766095 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Oct 30 00:20:16.766151 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Oct 30 00:20:16.766201 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Oct 30 00:20:16.766247 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Oct 30 00:20:16.766297 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Oct 30 00:20:16.766349 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Oct 30 00:20:16.766400 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Oct 30 00:20:16.766451 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Oct 30 00:20:16.766525 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Oct 30 00:20:16.766536 kernel: PCI: CLS 32 bytes, default 64 Oct 30 00:20:16.766543 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Oct 30 00:20:16.766551 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 30 00:20:16.766558 kernel: clocksource: Switched to clocksource tsc Oct 30 00:20:16.766563 kernel: Initialise system trusted keyrings Oct 30 00:20:16.766569 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Oct 30 00:20:16.766585 kernel: Key type asymmetric registered Oct 30 00:20:16.766592 kernel: Asymmetric key parser 'x509' registered Oct 30 00:20:16.766599 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Oct 30 00:20:16.766605 kernel: io scheduler mq-deadline registered Oct 30 00:20:16.766610 kernel: io scheduler kyber registered Oct 30 00:20:16.766891 kernel: io scheduler bfq 
registered Oct 30 00:20:16.766956 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Oct 30 00:20:16.767015 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767078 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Oct 30 00:20:16.767167 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767235 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Oct 30 00:20:16.767294 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767350 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Oct 30 00:20:16.767401 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767454 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Oct 30 00:20:16.767505 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767562 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Oct 30 00:20:16.767613 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767664 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Oct 30 00:20:16.767719 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767781 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Oct 30 00:20:16.767833 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767884 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Oct 30 00:20:16.767936 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.767996 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Oct 30 00:20:16.768052 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.768106 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Oct 30 00:20:16.768542 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.768601 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Oct 30 00:20:16.768654 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.768717 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Oct 30 00:20:16.768771 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.768823 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Oct 30 00:20:16.768873 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
Oct 30 00:20:16.768929 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Oct 30 00:20:16.768980 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769032 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Oct 30 00:20:16.769084 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769155 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Oct 30 00:20:16.769208 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769259 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Oct 30 00:20:16.769309 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769364 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Oct 30 00:20:16.769414 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769466 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Oct 30 00:20:16.769517 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769569 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Oct 30 00:20:16.769620 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769672 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Oct 30 00:20:16.769726 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769778 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Oct 30 00:20:16.769830 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769881 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Oct 30 00:20:16.769932 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.769983 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Oct 30 00:20:16.770034 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770085 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Oct 30 00:20:16.770152 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770204 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Oct 30 00:20:16.770255 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770306 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Oct 30 00:20:16.770358 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 
00:20:16.770409 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Oct 30 00:20:16.770459 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770514 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Oct 30 00:20:16.770565 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770625 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Oct 30 00:20:16.770676 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770729 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Oct 30 00:20:16.770780 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Oct 30 00:20:16.770792 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Oct 30 00:20:16.770800 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Oct 30 00:20:16.770806 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Oct 30 00:20:16.770813 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Oct 30 00:20:16.770819 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Oct 30 00:20:16.770825 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Oct 30 00:20:16.770878 kernel: rtc_cmos 00:01: registered as rtc0 Oct 30 00:20:16.770927 kernel: rtc_cmos 00:01: setting system clock to 2025-10-30T00:20:16 UTC (1761783616) Oct 30 00:20:16.770936 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Oct 30 00:20:16.770982 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Oct 30 00:20:16.770991 kernel: intel_pstate: CPU model not supported Oct 30 00:20:16.770997 kernel: NET: Registered PF_INET6 protocol family Oct 30 00:20:16.771003 kernel: Segment Routing with IPv6 Oct 30 00:20:16.771010 kernel: In-situ OAM (IOAM) with IPv6 Oct 30 00:20:16.771016 kernel: NET: Registered PF_PACKET protocol family Oct 30 00:20:16.771022 kernel: Key type dns_resolver registered Oct 30 00:20:16.771028 kernel: IPI shorthand broadcast: enabled Oct 30 00:20:16.771035 kernel: sched_clock: Marking stable (2622003803, 174362331)->(2809596769, -13230635) Oct 30 00:20:16.771043 kernel: registered taskstats version 1 Oct 30 00:20:16.771049 kernel: Loading compiled-in X.509 certificates Oct 30 00:20:16.771055 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.54-flatcar: 815fc40077fbc06b8d9e8a6016fea83aecff0a2a' Oct 30 00:20:16.771061 kernel: Demotion targets for Node 0: null Oct 30 00:20:16.771067 kernel: Key type .fscrypt registered Oct 30 00:20:16.771073 kernel: Key type fscrypt-provisioning registered Oct 30 00:20:16.771079 kernel: ima: No TPM chip found, activating TPM-bypass! Oct 30 00:20:16.771085 kernel: ima: Allocated hash algorithm: sha1 Oct 30 00:20:16.771093 kernel: ima: No architecture policies found Oct 30 00:20:16.771099 kernel: clk: Disabling unused clocks Oct 30 00:20:16.771105 kernel: Warning: unable to open an initial console. 
Oct 30 00:20:16.771112 kernel: Freeing unused kernel image (initmem) memory: 45544K Oct 30 00:20:16.771118 kernel: Write protecting the kernel read-only data: 40960k Oct 30 00:20:16.771133 kernel: Freeing unused kernel image (rodata/data gap) memory: 576K Oct 30 00:20:16.771141 kernel: Run /init as init process Oct 30 00:20:16.771147 kernel: with arguments: Oct 30 00:20:16.771154 kernel: /init Oct 30 00:20:16.771160 kernel: with environment: Oct 30 00:20:16.771169 kernel: HOME=/ Oct 30 00:20:16.771175 kernel: TERM=linux Oct 30 00:20:16.771182 systemd[1]: Successfully made /usr/ read-only. Oct 30 00:20:16.771191 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 30 00:20:16.771198 systemd[1]: Detected virtualization vmware. Oct 30 00:20:16.771204 systemd[1]: Detected architecture x86-64. Oct 30 00:20:16.771210 systemd[1]: Running in initrd. Oct 30 00:20:16.771218 systemd[1]: No hostname configured, using default hostname. Oct 30 00:20:16.771225 systemd[1]: Hostname set to . Oct 30 00:20:16.771231 systemd[1]: Initializing machine ID from random generator. Oct 30 00:20:16.771238 systemd[1]: Queued start job for default target initrd.target. Oct 30 00:20:16.771244 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 30 00:20:16.771251 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 30 00:20:16.771258 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Oct 30 00:20:16.771265 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 30 00:20:16.771273 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 30 00:20:16.771280 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 30 00:20:16.771287 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Oct 30 00:20:16.771293 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Oct 30 00:20:16.771300 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 30 00:20:16.771306 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 30 00:20:16.771313 systemd[1]: Reached target paths.target - Path Units. Oct 30 00:20:16.771321 systemd[1]: Reached target slices.target - Slice Units. Oct 30 00:20:16.771327 systemd[1]: Reached target swap.target - Swaps. Oct 30 00:20:16.771334 systemd[1]: Reached target timers.target - Timer Units. Oct 30 00:20:16.771341 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 30 00:20:16.771347 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 30 00:20:16.771354 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 30 00:20:16.771360 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Oct 30 00:20:16.771367 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Oct 30 00:20:16.771374 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 30 00:20:16.771382 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 30 00:20:16.771388 systemd[1]: Reached target sockets.target - Socket Units. Oct 30 00:20:16.771395 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Oct 30 00:20:16.771401 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 30 00:20:16.771408 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 30 00:20:16.771415 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Oct 30 00:20:16.771421 systemd[1]: Starting systemd-fsck-usr.service... Oct 30 00:20:16.771428 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 30 00:20:16.771435 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 30 00:20:16.771442 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 30 00:20:16.771448 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 30 00:20:16.771455 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 30 00:20:16.771463 systemd[1]: Finished systemd-fsck-usr.service. Oct 30 00:20:16.771470 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 30 00:20:16.771492 systemd-journald[222]: Collecting audit messages is disabled. Oct 30 00:20:16.771510 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 30 00:20:16.771519 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 30 00:20:16.771526 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 30 00:20:16.771532 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Oct 30 00:20:16.771539 kernel: Bridge firewalling registered Oct 30 00:20:16.771546 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 30 00:20:16.771552 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 30 00:20:16.771559 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 30 00:20:16.771566 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 30 00:20:16.771572 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 30 00:20:16.771580 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 30 00:20:16.771586 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 30 00:20:16.771594 systemd-journald[222]: Journal started Oct 30 00:20:16.771609 systemd-journald[222]: Runtime Journal (/run/log/journal/ee40c33d11ce45318d0e8a712a05e2fa) is 4.8M, max 38.5M, 33.7M free. Oct 30 00:20:16.710707 systemd-modules-load[226]: Inserted module 'overlay' Oct 30 00:20:16.748464 systemd-modules-load[226]: Inserted module 'br_netfilter' Oct 30 00:20:16.775136 systemd[1]: Started systemd-journald.service - Journal Service. Oct 30 00:20:16.781809 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Oct 30 00:20:16.788828 systemd-tmpfiles[263]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Oct 30 00:20:16.791236 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e5fe4ef982f4bbc75df9f63e805c4ec086c6d95878919f55fe8c638c4d2b3b13 Oct 30 00:20:16.791510 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 30 00:20:16.793042 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 30 00:20:16.824833 systemd-resolved[280]: Positive Trust Anchors: Oct 30 00:20:16.825047 systemd-resolved[280]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 30 00:20:16.825071 systemd-resolved[280]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 30 00:20:16.826994 systemd-resolved[280]: Defaulting to hostname 'linux'. Oct 30 00:20:16.828334 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 30 00:20:16.828486 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 30 00:20:16.852140 kernel: SCSI subsystem initialized Oct 30 00:20:16.873151 kernel: Loading iSCSI transport class v2.0-870. Oct 30 00:20:16.880150 kernel: iscsi: registered transport (tcp) Oct 30 00:20:16.903142 kernel: iscsi: registered transport (qla4xxx) Oct 30 00:20:16.903189 kernel: QLogic iSCSI HBA Driver Oct 30 00:20:16.914157 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 30 00:20:16.924967 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 30 00:20:16.926026 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 30 00:20:16.948971 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 30 00:20:16.950454 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 30 00:20:16.995183 kernel: raid6: avx2x4 gen() 38094 MB/s Oct 30 00:20:17.012152 kernel: raid6: avx2x2 gen() 51774 MB/s Oct 30 00:20:17.029359 kernel: raid6: avx2x1 gen() 44270 MB/s Oct 30 00:20:17.029399 kernel: raid6: using algorithm avx2x2 gen() 51774 MB/s Oct 30 00:20:17.047364 kernel: raid6: .... xor() 31923 MB/s, rmw enabled Oct 30 00:20:17.047410 kernel: raid6: using avx2x2 recovery algorithm Oct 30 00:20:17.064150 kernel: xor: automatically using best checksumming function avx Oct 30 00:20:17.176154 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 30 00:20:17.180250 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 30 00:20:17.181694 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Oct 30 00:20:17.199883 systemd-udevd[475]: Using default interface naming scheme 'v255'. Oct 30 00:20:17.203441 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 30 00:20:17.204440 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 30 00:20:17.218293 dracut-pre-trigger[476]: rd.md=0: removing MD RAID activation Oct 30 00:20:17.233446 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Oct 30 00:20:17.234372 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 30 00:20:17.317685 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 30 00:20:17.319252 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Oct 30 00:20:17.394540 kernel: VMware PVSCSI driver - version 1.0.7.0-k Oct 30 00:20:17.394595 kernel: vmw_pvscsi: using 64bit dma Oct 30 00:20:17.405444 kernel: vmw_pvscsi: max_id: 16 Oct 30 00:20:17.405482 kernel: vmw_pvscsi: setting ring_pages to 8 Oct 30 00:20:17.411629 kernel: vmw_pvscsi: enabling reqCallThreshold Oct 30 00:20:17.411674 kernel: vmw_pvscsi: driver-based request coalescing enabled Oct 30 00:20:17.411689 kernel: vmw_pvscsi: using MSI-X Oct 30 00:20:17.414697 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Oct 30 00:20:17.414815 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Oct 30 00:20:17.419134 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Oct 30 00:20:17.426145 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI Oct 30 00:20:17.431136 kernel: cryptd: max_cpu_qlen set to 1000 Oct 30 00:20:17.433140 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Oct 30 00:20:17.435775 (udev-worker)[540]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Oct 30 00:20:17.440156 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Oct 30 00:20:17.442504 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 30 00:20:17.442590 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 30 00:20:17.446268 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Oct 30 00:20:17.447068 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 30 00:20:17.452840 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Oct 30 00:20:17.452964 kernel: sd 0:0:0:0: [sda] Write Protect is off Oct 30 00:20:17.453031 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Oct 30 00:20:17.453092 kernel: libata version 3.00 loaded. Oct 30 00:20:17.454643 kernel: sd 0:0:0:0: [sda] Cache data unavailable Oct 30 00:20:17.454724 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Oct 30 00:20:17.455130 kernel: AES CTR mode by8 optimization enabled Oct 30 00:20:17.475343 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Oct 30 00:20:17.480284 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 30 00:20:17.480324 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Oct 30 00:20:17.488240 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Oct 30 00:20:17.493155 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2 Oct 30 00:20:17.493195 kernel: ata_piix 0000:00:07.1: version 2.13 Oct 30 00:20:17.496733 kernel: scsi host1: ata_piix Oct 30 00:20:17.497016 kernel: scsi host2: ata_piix Oct 30 00:20:17.500377 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 Oct 30 00:20:17.500416 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 Oct 30 00:20:17.536639 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Oct 30 00:20:17.541874 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Oct 30 00:20:17.546178 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Oct 30 00:20:17.546318 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Oct 30 00:20:17.551765 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Oct 30 00:20:17.552392 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 30 00:20:17.608509 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 30 00:20:17.622145 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 30 00:20:17.668161 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Oct 30 00:20:17.674161 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Oct 30 00:20:17.700540 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Oct 30 00:20:17.700678 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Oct 30 00:20:17.710140 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Oct 30 00:20:17.984227 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 30 00:20:17.984563 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Oct 30 00:20:17.984692 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 30 00:20:17.984889 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 30 00:20:17.985528 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 30 00:20:17.999682 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 30 00:20:18.623025 disk-uuid[629]: The operation has completed successfully. Oct 30 00:20:18.623256 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Oct 30 00:20:18.667088 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 30 00:20:18.667191 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 30 00:20:18.678535 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Oct 30 00:20:18.692317 sh[661]: Success Oct 30 00:20:18.709873 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 30 00:20:18.709907 kernel: device-mapper: uevent: version 1.0.3 Oct 30 00:20:18.709917 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Oct 30 00:20:18.717140 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Oct 30 00:20:18.754545 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Oct 30 00:20:18.757219 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Oct 30 00:20:18.766747 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Oct 30 00:20:18.781155 kernel: BTRFS: device fsid ad8523d8-35e6-44b9-a685-e8d871101da4 devid 1 transid 35 /dev/mapper/usr (254:0) scanned by mount (673) Oct 30 00:20:18.781212 kernel: BTRFS info (device dm-0): first mount of filesystem ad8523d8-35e6-44b9-a685-e8d871101da4 Oct 30 00:20:18.781227 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 30 00:20:18.830468 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 30 00:20:18.830524 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 30 00:20:18.830535 kernel: BTRFS info (device dm-0): enabling free space tree Oct 30 00:20:18.833345 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Oct 30 00:20:18.833779 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Oct 30 00:20:18.834455 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Oct 30 00:20:18.836225 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Oct 30 00:20:18.861144 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (696) Oct 30 00:20:18.863831 kernel: BTRFS info (device sda6): first mount of filesystem 20cadb25-62ee-49b8-9ff8-7ba27828b77e Oct 30 00:20:18.863869 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 30 00:20:18.868188 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 30 00:20:18.868234 kernel: BTRFS info (device sda6): enabling free space tree Oct 30 00:20:18.873200 kernel: BTRFS info (device sda6): last unmount of filesystem 20cadb25-62ee-49b8-9ff8-7ba27828b77e Oct 30 00:20:18.877053 systemd[1]: Finished ignition-setup.service - Ignition (setup). Oct 30 00:20:18.877945 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 30 00:20:18.940118 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Oct 30 00:20:18.940834 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 30 00:20:19.013237 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 30 00:20:19.014322 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 30 00:20:19.043225 systemd-networkd[847]: lo: Link UP Oct 30 00:20:19.043489 systemd-networkd[847]: lo: Gained carrier Oct 30 00:20:19.044503 systemd-networkd[847]: Enumeration completed Oct 30 00:20:19.044704 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 30 00:20:19.044867 systemd[1]: Reached target network.target - Network. Oct 30 00:20:19.045384 systemd-networkd[847]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. 
Oct 30 00:20:19.048861 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Oct 30 00:20:19.049029 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Oct 30 00:20:19.049725 systemd-networkd[847]: ens192: Link UP Oct 30 00:20:19.049865 systemd-networkd[847]: ens192: Gained carrier Oct 30 00:20:19.069580 ignition[715]: Ignition 2.22.0 Oct 30 00:20:19.069590 ignition[715]: Stage: fetch-offline Oct 30 00:20:19.069611 ignition[715]: no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:19.069616 ignition[715]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:19.069669 ignition[715]: parsed url from cmdline: "" Oct 30 00:20:19.069671 ignition[715]: no config URL provided Oct 30 00:20:19.069675 ignition[715]: reading system config file "/usr/lib/ignition/user.ign" Oct 30 00:20:19.069679 ignition[715]: no config at "/usr/lib/ignition/user.ign" Oct 30 00:20:19.070048 ignition[715]: config successfully fetched Oct 30 00:20:19.070066 ignition[715]: parsing config with SHA512: 625589c332ebe93a7eef8a2fc5a912ed829997c6300816e2ad23caa5bcabb957c5bea09af0197049a4fd070efb1166c6ea14cd0d97fefa0a2ff79fce987baf69 Oct 30 00:20:19.072343 unknown[715]: fetched base config from "system" Oct 30 00:20:19.072351 unknown[715]: fetched user config from "vmware" Oct 30 00:20:19.072568 ignition[715]: fetch-offline: fetch-offline passed Oct 30 00:20:19.072600 ignition[715]: Ignition finished successfully Oct 30 00:20:19.073809 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 30 00:20:19.074028 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Oct 30 00:20:19.074511 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Oct 30 00:20:19.098287 ignition[856]: Ignition 2.22.0 Oct 30 00:20:19.098592 ignition[856]: Stage: kargs Oct 30 00:20:19.098809 ignition[856]: no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:19.098943 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:19.099791 ignition[856]: kargs: kargs passed Oct 30 00:20:19.099960 ignition[856]: Ignition finished successfully Oct 30 00:20:19.101362 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 30 00:20:19.102296 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Oct 30 00:20:19.122529 ignition[862]: Ignition 2.22.0 Oct 30 00:20:19.122826 ignition[862]: Stage: disks Oct 30 00:20:19.123019 ignition[862]: no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:19.123166 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:19.123764 ignition[862]: disks: disks passed Oct 30 00:20:19.123908 ignition[862]: Ignition finished successfully Oct 30 00:20:19.124939 systemd[1]: Finished ignition-disks.service - Ignition (disks). Oct 30 00:20:19.125179 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Oct 30 00:20:19.125299 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Oct 30 00:20:19.125490 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 30 00:20:19.125699 systemd[1]: Reached target sysinit.target - System Initialization. Oct 30 00:20:19.125870 systemd[1]: Reached target basic.target - Basic System. Oct 30 00:20:19.126573 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Oct 30 00:20:19.141852 systemd-fsck[870]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Oct 30 00:20:19.143356 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Oct 30 00:20:19.144336 systemd[1]: Mounting sysroot.mount - /sysroot... Oct 30 00:20:19.251147 kernel: EXT4-fs (sda9): mounted filesystem 02607114-2ead-44bc-a76e-2d51f82b108e r/w with ordered data mode. Quota mode: none. Oct 30 00:20:19.250941 systemd[1]: Mounted sysroot.mount - /sysroot. Oct 30 00:20:19.251355 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Oct 30 00:20:19.252486 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 30 00:20:19.254186 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Oct 30 00:20:19.254673 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Oct 30 00:20:19.254868 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Oct 30 00:20:19.254888 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Oct 30 00:20:19.263348 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Oct 30 00:20:19.264395 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Oct 30 00:20:19.313142 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (878) Oct 30 00:20:19.315367 kernel: BTRFS info (device sda6): first mount of filesystem 20cadb25-62ee-49b8-9ff8-7ba27828b77e Oct 30 00:20:19.315406 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 30 00:20:19.323146 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 30 00:20:19.323198 kernel: BTRFS info (device sda6): enabling free space tree Oct 30 00:20:19.324133 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 30 00:20:19.331131 initrd-setup-root[902]: cut: /sysroot/etc/passwd: No such file or directory Oct 30 00:20:19.333690 initrd-setup-root[909]: cut: /sysroot/etc/group: No such file or directory Oct 30 00:20:19.336498 initrd-setup-root[916]: cut: /sysroot/etc/shadow: No such file or directory Oct 30 00:20:19.338932 initrd-setup-root[923]: cut: /sysroot/etc/gshadow: No such file or directory Oct 30 00:20:19.433187 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Oct 30 00:20:19.433897 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Oct 30 00:20:19.435215 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Oct 30 00:20:19.451136 kernel: BTRFS info (device sda6): last unmount of filesystem 20cadb25-62ee-49b8-9ff8-7ba27828b77e Oct 30 00:20:19.464165 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Oct 30 00:20:19.470203 ignition[991]: INFO : Ignition 2.22.0 Oct 30 00:20:19.470537 ignition[991]: INFO : Stage: mount Oct 30 00:20:19.470770 ignition[991]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:19.470914 ignition[991]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:19.471680 ignition[991]: INFO : mount: mount passed Oct 30 00:20:19.471830 ignition[991]: INFO : Ignition finished successfully Oct 30 00:20:19.472941 systemd[1]: Finished ignition-mount.service - Ignition (mount). Oct 30 00:20:19.473689 systemd[1]: Starting ignition-files.service - Ignition (files)... Oct 30 00:20:19.777853 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Oct 30 00:20:19.778886 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 30 00:20:19.838141 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1003) Oct 30 00:20:19.840464 kernel: BTRFS info (device sda6): first mount of filesystem 20cadb25-62ee-49b8-9ff8-7ba27828b77e Oct 30 00:20:19.840490 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Oct 30 00:20:19.844189 kernel: BTRFS info (device sda6): enabling ssd optimizations Oct 30 00:20:19.844226 kernel: BTRFS info (device sda6): enabling free space tree Oct 30 00:20:19.845471 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 30 00:20:19.866631 ignition[1020]: INFO : Ignition 2.22.0 Oct 30 00:20:19.866631 ignition[1020]: INFO : Stage: files Oct 30 00:20:19.867017 ignition[1020]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:19.867017 ignition[1020]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:19.867362 ignition[1020]: DEBUG : files: compiled without relabeling support, skipping Oct 30 00:20:19.868202 ignition[1020]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Oct 30 00:20:19.868202 ignition[1020]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Oct 30 00:20:19.869565 ignition[1020]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Oct 30 00:20:19.869788 ignition[1020]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Oct 30 00:20:19.869948 unknown[1020]: wrote ssh authorized keys file for user: core Oct 30 00:20:19.870188 ignition[1020]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Oct 30 00:20:19.871935 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 30 00:20:19.872134 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Oct 30 00:20:19.949226 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Oct 30 00:20:20.359760 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Oct 30 00:20:20.359760 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 30 00:20:20.360191 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Oct 30 00:20:20.365551 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Oct 30 00:20:20.365731 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Oct 30 00:20:20.365731 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Oct 30 00:20:20.373100 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Oct 30 00:20:20.373100 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Oct 30 00:20:20.373552 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Oct 30 00:20:20.803233 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Oct 30 00:20:21.035758 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Oct 30 00:20:21.035758 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Oct 30 00:20:21.040661 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Oct 30 00:20:21.040661 ignition[1020]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Oct 30 00:20:21.040661 ignition[1020]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 30 00:20:21.044845 ignition[1020]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Oct 30 00:20:21.045203 ignition[1020]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Oct 30 00:20:21.054211 systemd-networkd[847]: ens192: Gained IPv6LL Oct 30 00:20:21.078143 ignition[1020]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Oct 30 00:20:21.081408 ignition[1020]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Oct 30 00:20:21.081931 ignition[1020]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Oct 30 
00:20:21.081931 ignition[1020]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Oct 30 00:20:21.081931 ignition[1020]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Oct 30 00:20:21.081931 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Oct 30 00:20:21.081931 ignition[1020]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Oct 30 00:20:21.081931 ignition[1020]: INFO : files: files passed Oct 30 00:20:21.081931 ignition[1020]: INFO : Ignition finished successfully Oct 30 00:20:21.083811 systemd[1]: Finished ignition-files.service - Ignition (files). Oct 30 00:20:21.084924 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Oct 30 00:20:21.087198 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Oct 30 00:20:21.095514 systemd[1]: ignition-quench.service: Deactivated successfully. Oct 30 00:20:21.095797 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Oct 30 00:20:21.099785 initrd-setup-root-after-ignition[1052]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 30 00:20:21.099785 initrd-setup-root-after-ignition[1052]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Oct 30 00:20:21.100888 initrd-setup-root-after-ignition[1056]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 30 00:20:21.101744 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 30 00:20:21.102388 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Oct 30 00:20:21.103284 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Oct 30 00:20:21.123215 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Oct 30 00:20:21.123289 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Oct 30 00:20:21.123554 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Oct 30 00:20:21.123828 systemd[1]: Reached target initrd.target - Initrd Default Target. Oct 30 00:20:21.124027 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Oct 30 00:20:21.124493 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Oct 30 00:20:21.133045 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 30 00:20:21.133781 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Oct 30 00:20:21.146725 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Oct 30 00:20:21.146961 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 30 00:20:21.147278 systemd[1]: Stopped target timers.target - Timer Units. Oct 30 00:20:21.147516 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Oct 30 00:20:21.147623 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 30 00:20:21.148047 systemd[1]: Stopped target initrd.target - Initrd Default Target. Oct 30 00:20:21.148302 systemd[1]: Stopped target basic.target - Basic System. Oct 30 00:20:21.148524 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. 
Oct 30 00:20:21.148733 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Oct 30 00:20:21.148979 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Oct 30 00:20:21.149236 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Oct 30 00:20:21.149475 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Oct 30 00:20:21.149693 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Oct 30 00:20:21.149971 systemd[1]: Stopped target sysinit.target - System Initialization. Oct 30 00:20:21.150210 systemd[1]: Stopped target local-fs.target - Local File Systems. Oct 30 00:20:21.150429 systemd[1]: Stopped target swap.target - Swaps. Oct 30 00:20:21.150618 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Oct 30 00:20:21.150695 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Oct 30 00:20:21.151024 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Oct 30 00:20:21.151291 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 30 00:20:21.151522 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Oct 30 00:20:21.151576 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 30 00:20:21.151752 systemd[1]: dracut-initqueue.service: Deactivated successfully. Oct 30 00:20:21.151821 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Oct 30 00:20:21.152148 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Oct 30 00:20:21.152235 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Oct 30 00:20:21.152494 systemd[1]: Stopped target paths.target - Path Units. Oct 30 00:20:21.152644 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Oct 30 00:20:21.156178 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 30 00:20:21.156428 systemd[1]: Stopped target slices.target - Slice Units. Oct 30 00:20:21.156634 systemd[1]: Stopped target sockets.target - Socket Units. Oct 30 00:20:21.156847 systemd[1]: iscsid.socket: Deactivated successfully. Oct 30 00:20:21.156916 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Oct 30 00:20:21.157161 systemd[1]: iscsiuio.socket: Deactivated successfully. Oct 30 00:20:21.157208 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 30 00:20:21.157395 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Oct 30 00:20:21.157479 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 30 00:20:21.157739 systemd[1]: ignition-files.service: Deactivated successfully. Oct 30 00:20:21.157800 systemd[1]: Stopped ignition-files.service - Ignition (files). Oct 30 00:20:21.158497 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Oct 30 00:20:21.160216 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Oct 30 00:20:21.160345 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Oct 30 00:20:21.160417 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Oct 30 00:20:21.161299 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Oct 30 00:20:21.161365 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Oct 30 00:20:21.164041 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Oct 30 00:20:21.171376 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Oct 30 00:20:21.181567 ignition[1078]: INFO : Ignition 2.22.0 Oct 30 00:20:21.181825 ignition[1078]: INFO : Stage: umount Oct 30 00:20:21.181954 systemd[1]: sysroot-boot.mount: Deactivated successfully. Oct 30 00:20:21.182151 ignition[1078]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 30 00:20:21.182285 ignition[1078]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Oct 30 00:20:21.182990 ignition[1078]: INFO : umount: umount passed Oct 30 00:20:21.182990 ignition[1078]: INFO : Ignition finished successfully Oct 30 00:20:21.185075 systemd[1]: ignition-mount.service: Deactivated successfully. Oct 30 00:20:21.185258 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Oct 30 00:20:21.185753 systemd[1]: Stopped target network.target - Network. Oct 30 00:20:21.185990 systemd[1]: ignition-disks.service: Deactivated successfully. Oct 30 00:20:21.186151 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Oct 30 00:20:21.186383 systemd[1]: ignition-kargs.service: Deactivated successfully. Oct 30 00:20:21.186521 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Oct 30 00:20:21.186736 systemd[1]: ignition-setup.service: Deactivated successfully. Oct 30 00:20:21.186854 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Oct 30 00:20:21.187071 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Oct 30 00:20:21.187096 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Oct 30 00:20:21.187521 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Oct 30 00:20:21.187787 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Oct 30 00:20:21.189475 systemd[1]: systemd-resolved.service: Deactivated successfully. Oct 30 00:20:21.189668 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Oct 30 00:20:21.191090 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Oct 30 00:20:21.191396 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Oct 30 00:20:21.191546 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 30 00:20:21.192335 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Oct 30 00:20:21.197050 systemd[1]: systemd-networkd.service: Deactivated successfully. Oct 30 00:20:21.197118 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Oct 30 00:20:21.197856 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Oct 30 00:20:21.197968 systemd[1]: Stopped target network-pre.target - Preparation for Network. Oct 30 00:20:21.198103 systemd[1]: systemd-networkd.socket: Deactivated successfully. Oct 30 00:20:21.198119 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Oct 30 00:20:21.199218 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Oct 30 00:20:21.199333 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Oct 30 00:20:21.199359 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 30 00:20:21.199465 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Oct 30 00:20:21.199489 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. 
Oct 30 00:20:21.199641 systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 30 00:20:21.199661 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Oct 30 00:20:21.201219 systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 30 00:20:21.201247 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Oct 30 00:20:21.201631 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 30 00:20:21.203884 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Oct 30 00:20:21.210525 systemd[1]: systemd-udevd.service: Deactivated successfully. Oct 30 00:20:21.212339 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 30 00:20:21.212595 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Oct 30 00:20:21.212615 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Oct 30 00:20:21.212769 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Oct 30 00:20:21.212786 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Oct 30 00:20:21.213023 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Oct 30 00:20:21.213046 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Oct 30 00:20:21.213461 systemd[1]: dracut-cmdline.service: Deactivated successfully. Oct 30 00:20:21.213486 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Oct 30 00:20:21.213991 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Oct 30 00:20:21.214014 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 30 00:20:21.215194 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Oct 30 00:20:21.215416 systemd[1]: systemd-network-generator.service: Deactivated successfully. Oct 30 00:20:21.215545 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Oct 30 00:20:21.215871 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Oct 30 00:20:21.215896 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 30 00:20:21.216362 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Oct 30 00:20:21.216495 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 30 00:20:21.216814 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Oct 30 00:20:21.216934 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Oct 30 00:20:21.217211 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 30 00:20:21.217334 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 30 00:20:21.217941 systemd[1]: network-cleanup.service: Deactivated successfully. Oct 30 00:20:21.224263 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Oct 30 00:20:21.227106 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Oct 30 00:20:21.227345 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Oct 30 00:20:21.288449 systemd[1]: sysroot-boot.service: Deactivated successfully. Oct 30 00:20:21.288532 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Oct 30 00:20:21.288990 systemd[1]: Reached target initrd-switch-root.target - Switch Root. 
Oct 30 00:20:21.289760 systemd[1]: initrd-setup-root.service: Deactivated successfully. Oct 30 00:20:21.289808 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Oct 30 00:20:21.290809 systemd[1]: Starting initrd-switch-root.service - Switch Root... Oct 30 00:20:21.313681 systemd[1]: Switching root. Oct 30 00:20:21.362498 systemd-journald[222]: Journal stopped Oct 30 00:20:22.930387 systemd-journald[222]: Received SIGTERM from PID 1 (systemd). Oct 30 00:20:22.930414 kernel: SELinux: policy capability network_peer_controls=1 Oct 30 00:20:22.930423 kernel: SELinux: policy capability open_perms=1 Oct 30 00:20:22.930429 kernel: SELinux: policy capability extended_socket_class=1 Oct 30 00:20:22.930434 kernel: SELinux: policy capability always_check_network=0 Oct 30 00:20:22.930440 kernel: SELinux: policy capability cgroup_seclabel=1 Oct 30 00:20:22.930446 kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 30 00:20:22.930453 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Oct 30 00:20:22.930458 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Oct 30 00:20:22.930464 kernel: SELinux: policy capability userspace_initial_context=0 Oct 30 00:20:22.930470 kernel: audit: type=1403 audit(1761783622.160:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Oct 30 00:20:22.930476 systemd[1]: Successfully loaded SELinux policy in 67.344ms. Oct 30 00:20:22.930483 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.863ms. Oct 30 00:20:22.930492 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 30 00:20:22.930499 systemd[1]: Detected virtualization vmware. Oct 30 00:20:22.930507 systemd[1]: Detected architecture x86-64. Oct 30 00:20:22.930513 systemd[1]: Detected first boot. Oct 30 00:20:22.930520 systemd[1]: Initializing machine ID from random generator. Oct 30 00:20:22.930528 zram_generator::config[1122]: No configuration found. Oct 30 00:20:22.930619 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Oct 30 00:20:22.930630 kernel: Guest personality initialized and is active Oct 30 00:20:22.930636 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Oct 30 00:20:22.930643 kernel: Initialized host personality Oct 30 00:20:22.930649 kernel: NET: Registered PF_VSOCK protocol family Oct 30 00:20:22.930657 systemd[1]: Populated /etc with preset unit settings. Oct 30 00:20:22.930665 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 30 00:20:22.930672 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Oct 30 00:20:22.930679 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Oct 30 00:20:22.930686 systemd[1]: initrd-switch-root.service: Deactivated successfully. Oct 30 00:20:22.930693 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Oct 30 00:20:22.930699 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Oct 30 00:20:22.930707 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. 
Oct 30 00:20:22.930714 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Oct 30 00:20:22.930721 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Oct 30 00:20:22.930728 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Oct 30 00:20:22.930735 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Oct 30 00:20:22.930741 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Oct 30 00:20:22.930748 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Oct 30 00:20:22.930755 systemd[1]: Created slice user.slice - User and Session Slice. Oct 30 00:20:22.930763 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 30 00:20:22.930771 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 30 00:20:22.930779 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Oct 30 00:20:22.930786 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Oct 30 00:20:22.930793 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Oct 30 00:20:22.930800 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 30 00:20:22.930807 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Oct 30 00:20:22.930815 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 30 00:20:22.930822 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 30 00:20:22.930829 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Oct 30 00:20:22.930836 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Oct 30 00:20:22.930844 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Oct 30 00:20:22.930850 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Oct 30 00:20:22.930857 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 30 00:20:22.930864 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 30 00:20:22.930871 systemd[1]: Reached target slices.target - Slice Units. Oct 30 00:20:22.930879 systemd[1]: Reached target swap.target - Swaps. Oct 30 00:20:22.930886 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Oct 30 00:20:22.930893 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Oct 30 00:20:22.930900 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Oct 30 00:20:22.930909 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 30 00:20:22.930916 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 30 00:20:22.930925 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 30 00:20:22.930933 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Oct 30 00:20:22.930940 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Oct 30 00:20:22.930947 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Oct 30 00:20:22.930954 systemd[1]: Mounting media.mount - External Media Directory... 
Oct 30 00:20:22.930961 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:22.930968 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Oct 30 00:20:22.930976 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Oct 30 00:20:22.930983 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Oct 30 00:20:22.930990 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Oct 30 00:20:22.930997 systemd[1]: Reached target machines.target - Containers. Oct 30 00:20:22.931004 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Oct 30 00:20:22.931010 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Oct 30 00:20:22.931017 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 30 00:20:22.931024 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Oct 30 00:20:22.931032 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 30 00:20:22.931039 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 30 00:20:22.931046 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 30 00:20:22.931053 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 30 00:20:22.931060 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 30 00:20:22.931067 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Oct 30 00:20:22.931074 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Oct 30 00:20:22.931082 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Oct 30 00:20:22.931089 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Oct 30 00:20:22.931097 systemd[1]: Stopped systemd-fsck-usr.service. Oct 30 00:20:22.931104 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 30 00:20:22.931111 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 30 00:20:22.931118 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 30 00:20:22.936258 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 30 00:20:22.936271 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Oct 30 00:20:22.936279 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Oct 30 00:20:22.936286 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 30 00:20:22.936297 systemd[1]: verity-setup.service: Deactivated successfully. Oct 30 00:20:22.936304 systemd[1]: Stopped verity-setup.service. Oct 30 00:20:22.936311 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:22.936319 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. 
Oct 30 00:20:22.936326 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Oct 30 00:20:22.936333 systemd[1]: Mounted media.mount - External Media Directory. Oct 30 00:20:22.936340 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Oct 30 00:20:22.936346 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Oct 30 00:20:22.936355 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Oct 30 00:20:22.936362 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 30 00:20:22.936369 systemd[1]: modprobe@configfs.service: Deactivated successfully. Oct 30 00:20:22.936376 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Oct 30 00:20:22.936383 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 30 00:20:22.936390 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 30 00:20:22.936396 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 30 00:20:22.936403 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 30 00:20:22.936410 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Oct 30 00:20:22.936418 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 30 00:20:22.936426 kernel: loop: module loaded Oct 30 00:20:22.936433 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 30 00:20:22.936439 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 30 00:20:22.936446 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 30 00:20:22.936453 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Oct 30 00:20:22.936460 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 30 00:20:22.936467 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Oct 30 00:20:22.936478 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Oct 30 00:20:22.936485 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 30 00:20:22.936494 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Oct 30 00:20:22.936501 kernel: fuse: init (API version 7.41) Oct 30 00:20:22.936508 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Oct 30 00:20:22.936515 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 30 00:20:22.936522 kernel: ACPI: bus type drm_connector registered Oct 30 00:20:22.936549 systemd-journald[1208]: Collecting audit messages is disabled. Oct 30 00:20:22.936567 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Oct 30 00:20:22.936575 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 30 00:20:22.936584 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Oct 30 00:20:22.936591 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Oct 30 00:20:22.936599 systemd-journald[1208]: Journal started Oct 30 00:20:22.936617 systemd-journald[1208]: Runtime Journal (/run/log/journal/d7e59173295a42ceadb323815de8b29c) is 4.8M, max 38.5M, 33.7M free. Oct 30 00:20:22.716836 systemd[1]: Queued start job for default target multi-user.target. Oct 30 00:20:22.931234 ignition[1222]: Ignition 2.22.0 Oct 30 00:20:22.723238 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Oct 30 00:20:22.931407 ignition[1222]: deleting config from guestinfo properties Oct 30 00:20:22.723467 systemd[1]: systemd-journald.service: Deactivated successfully. Oct 30 00:20:22.938056 ignition[1222]: Successfully deleted config Oct 30 00:20:22.943684 jq[1192]: true Oct 30 00:20:22.945297 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 30 00:20:22.945531 jq[1220]: true Oct 30 00:20:22.949333 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Oct 30 00:20:22.957700 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 30 00:20:22.957743 systemd[1]: Started systemd-journald.service - Journal Service. Oct 30 00:20:22.956778 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 30 00:20:22.956897 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 30 00:20:22.957162 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 30 00:20:22.957280 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 30 00:20:22.958287 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Oct 30 00:20:22.958948 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Oct 30 00:20:22.962534 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Oct 30 00:20:22.971614 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Oct 30 00:20:22.974925 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Oct 30 00:20:22.979924 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Oct 30 00:20:22.987184 kernel: loop0: detected capacity change from 0 to 2960 Oct 30 00:20:22.986888 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Oct 30 00:20:22.997157 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 30 00:20:23.002395 systemd-journald[1208]: Time spent on flushing to /var/log/journal/d7e59173295a42ceadb323815de8b29c is 37.924ms for 1767 entries. Oct 30 00:20:23.002395 systemd-journald[1208]: System Journal (/var/log/journal/d7e59173295a42ceadb323815de8b29c) is 8M, max 584.8M, 576.8M free. Oct 30 00:20:23.070333 systemd-journald[1208]: Received client request to flush runtime journal. Oct 30 00:20:23.070368 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 30 00:20:23.070383 kernel: loop1: detected capacity change from 0 to 128016 Oct 30 00:20:23.004138 systemd-tmpfiles[1249]: ACLs are not supported, ignoring. Oct 30 00:20:23.004150 systemd-tmpfiles[1249]: ACLs are not supported, ignoring. Oct 30 00:20:23.008669 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Oct 30 00:20:23.013155 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 30 00:20:23.016315 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Oct 30 00:20:23.072361 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Oct 30 00:20:23.083164 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 30 00:20:23.087093 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 30 00:20:23.087179 kernel: loop2: detected capacity change from 0 to 229808 Oct 30 00:20:23.114945 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Oct 30 00:20:23.114957 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Oct 30 00:20:23.119451 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 30 00:20:23.121260 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 30 00:20:23.129313 kernel: loop3: detected capacity change from 0 to 110984 Oct 30 00:20:23.179143 kernel: loop4: detected capacity change from 0 to 2960 Oct 30 00:20:23.220275 kernel: loop5: detected capacity change from 0 to 128016 Oct 30 00:20:23.244140 kernel: loop6: detected capacity change from 0 to 229808 Oct 30 00:20:23.262147 kernel: loop7: detected capacity change from 0 to 110984 Oct 30 00:20:23.278512 (sd-merge)[1299]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Oct 30 00:20:23.279894 (sd-merge)[1299]: Merged extensions into '/usr'. Oct 30 00:20:23.285174 systemd[1]: Reload requested from client PID 1248 ('systemd-sysext') (unit systemd-sysext.service)... Oct 30 00:20:23.285184 systemd[1]: Reloading... Oct 30 00:20:23.336140 zram_generator::config[1324]: No configuration found. Oct 30 00:20:23.492910 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 30 00:20:23.534677 ldconfig[1237]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Oct 30 00:20:23.546108 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Oct 30 00:20:23.546242 systemd[1]: Reloading finished in 260 ms. Oct 30 00:20:23.575345 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Oct 30 00:20:23.575674 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Oct 30 00:20:23.575954 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Oct 30 00:20:23.582038 systemd[1]: Starting ensure-sysext.service... Oct 30 00:20:23.585065 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 30 00:20:23.586466 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 30 00:20:23.592179 systemd[1]: Reload requested from client PID 1383 ('systemctl') (unit ensure-sysext.service)... Oct 30 00:20:23.592186 systemd[1]: Reloading... Oct 30 00:20:23.603650 systemd-tmpfiles[1384]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Oct 30 00:20:23.603669 systemd-tmpfiles[1384]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Oct 30 00:20:23.603811 systemd-tmpfiles[1384]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Oct 30 00:20:23.603985 systemd-tmpfiles[1384]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
Oct 30 00:20:23.604462 systemd-tmpfiles[1384]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Oct 30 00:20:23.604628 systemd-tmpfiles[1384]: ACLs are not supported, ignoring. Oct 30 00:20:23.604662 systemd-tmpfiles[1384]: ACLs are not supported, ignoring. Oct 30 00:20:23.608203 systemd-udevd[1385]: Using default interface naming scheme 'v255'. Oct 30 00:20:23.620168 systemd-tmpfiles[1384]: Detected autofs mount point /boot during canonicalization of boot. Oct 30 00:20:23.620173 systemd-tmpfiles[1384]: Skipping /boot Oct 30 00:20:23.626177 systemd-tmpfiles[1384]: Detected autofs mount point /boot during canonicalization of boot. Oct 30 00:20:23.626182 systemd-tmpfiles[1384]: Skipping /boot Oct 30 00:20:23.630140 zram_generator::config[1407]: No configuration found. Oct 30 00:20:23.753949 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 30 00:20:23.821987 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Oct 30 00:20:23.822295 kernel: mousedev: PS/2 mouse device common for all mice Oct 30 00:20:23.822255 systemd[1]: Reloading finished in 229 ms. Oct 30 00:20:23.824156 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Oct 30 00:20:23.829453 kernel: ACPI: button: Power Button [PWRF] Oct 30 00:20:23.837137 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 30 00:20:23.841644 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 30 00:20:23.850108 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 30 00:20:23.854287 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Oct 30 00:20:23.856308 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 30 00:20:23.859637 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Oct 30 00:20:23.868266 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 30 00:20:23.871885 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 30 00:20:23.873497 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Oct 30 00:20:23.873990 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 30 00:20:23.874170 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 30 00:20:23.889697 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Oct 30 00:20:23.892993 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.895227 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 30 00:20:23.901708 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 30 00:20:23.906013 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 30 00:20:23.909484 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 30 00:20:23.909685 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Oct 30 00:20:23.909820 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 30 00:20:23.911617 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Oct 30 00:20:23.911773 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.915628 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Oct 30 00:20:23.915890 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 30 00:20:23.925234 systemd[1]: Starting systemd-update-done.service - Update is Completed... Oct 30 00:20:23.928158 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Oct 30 00:20:23.931534 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.931645 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 30 00:20:23.931706 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 30 00:20:23.931759 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.933071 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Oct 30 00:20:23.933575 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 30 00:20:23.935990 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.939491 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 30 00:20:23.939793 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 30 00:20:23.939897 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 30 00:20:23.940176 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 30 00:20:23.940234 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 30 00:20:23.941186 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 30 00:20:23.941315 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 30 00:20:23.951260 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Oct 30 00:20:23.951952 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Oct 30 00:20:23.956212 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 30 00:20:23.960230 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Oct 30 00:20:23.960338 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 30 00:20:23.960608 systemd[1]: Finished ensure-sysext.service. Oct 30 00:20:23.965210 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Oct 30 00:20:23.975816 augenrules[1550]: No rules Oct 30 00:20:23.975113 systemd[1]: audit-rules.service: Deactivated successfully. Oct 30 00:20:23.979070 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 30 00:20:23.979362 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 30 00:20:23.980153 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 30 00:20:23.980434 systemd[1]: Finished systemd-update-done.service - Update is Completed. Oct 30 00:20:23.985730 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 30 00:20:23.989758 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Oct 30 00:20:23.989973 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 30 00:20:23.990254 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 30 00:20:23.993990 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 30 00:20:23.994177 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 30 00:20:24.000513 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Oct 30 00:20:24.015961 systemd[1]: Started systemd-userdbd.service - User Database Manager. Oct 30 00:20:24.042662 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 30 00:20:24.052811 (udev-worker)[1435]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Oct 30 00:20:24.123106 systemd-networkd[1507]: lo: Link UP Oct 30 00:20:24.123111 systemd-networkd[1507]: lo: Gained carrier Oct 30 00:20:24.124084 systemd-networkd[1507]: Enumeration completed Oct 30 00:20:24.124152 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 30 00:20:24.126245 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Oct 30 00:20:24.128895 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Oct 30 00:20:24.134426 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Oct 30 00:20:24.134608 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Oct 30 00:20:24.131432 systemd-networkd[1507]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Oct 30 00:20:24.141520 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 30 00:20:24.142083 systemd-networkd[1507]: ens192: Link UP Oct 30 00:20:24.142182 systemd-networkd[1507]: ens192: Gained carrier Oct 30 00:20:24.143467 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Oct 30 00:20:24.143610 systemd[1]: Reached target time-set.target - System Time Set. Oct 30 00:20:24.146618 systemd-timesyncd[1544]: Network configuration changed, trying to establish connection. 
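[Editor's note] ens192 is configured above from /etc/systemd/network/00-vmware.network, whose contents are not reproduced in the log. A minimal DHCP-based .network unit of the kind used for VMware guests would look roughly like this (illustrative only, not the shipped file):

    [Match]
    Name=ens192

    [Network]
    DHCP=yes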
Oct 30 00:20:24.149882 systemd-resolved[1508]: Positive Trust Anchors: Oct 30 00:20:24.149892 systemd-resolved[1508]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 30 00:20:24.149916 systemd-resolved[1508]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 30 00:20:24.156972 systemd-resolved[1508]: Defaulting to hostname 'linux'. Oct 30 00:20:24.157857 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 30 00:20:24.158094 systemd[1]: Reached target network.target - Network. Oct 30 00:20:24.158552 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 30 00:20:24.158663 systemd[1]: Reached target sysinit.target - System Initialization. Oct 30 00:20:24.158859 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Oct 30 00:20:24.158978 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Oct 30 00:20:24.159083 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Oct 30 00:20:24.159311 systemd[1]: Started logrotate.timer - Daily rotation of log files. Oct 30 00:20:24.159446 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Oct 30 00:20:24.159549 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Oct 30 00:20:24.159658 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Oct 30 00:20:24.159675 systemd[1]: Reached target paths.target - Path Units. Oct 30 00:20:24.159758 systemd[1]: Reached target timers.target - Timer Units. Oct 30 00:20:24.160297 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Oct 30 00:20:24.161484 systemd[1]: Starting docker.socket - Docker Socket for the API... Oct 30 00:20:24.162767 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Oct 30 00:20:24.162946 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Oct 30 00:20:24.163057 systemd[1]: Reached target ssh-access.target - SSH Access Available. Oct 30 00:20:24.166193 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Oct 30 00:20:24.166451 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Oct 30 00:20:24.167073 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Oct 30 00:20:24.167266 systemd[1]: Listening on docker.socket - Docker Socket for the API. Oct 30 00:20:24.167953 systemd[1]: Reached target sockets.target - Socket Units. Oct 30 00:20:24.168055 systemd[1]: Reached target basic.target - Basic System. Oct 30 00:20:24.168197 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. 
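[Editor's note] The positive trust anchor logged above is the IANA root-zone DS record for KSK-2017 (key tag 20326), which systemd-resolved ships built in; the negative anchors are private-address reverse zones and special-use names that are not expected to be DNSSEC-signed. Anchors can also be supplied or overridden through dnssec-trust-anchors.d(5); restating the built-in one explicitly would look like this (sketch only, normally unnecessary):

    # /etc/dnssec-trust-anchors.d/root.positive
    . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d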
Oct 30 00:20:24.168221 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Oct 30 00:20:24.168953 systemd[1]: Starting containerd.service - containerd container runtime... Oct 30 00:20:24.171310 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Oct 30 00:20:24.172616 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Oct 30 00:20:24.174216 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Oct 30 00:20:24.174925 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Oct 30 00:20:24.175209 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Oct 30 00:20:24.177265 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Oct 30 00:22:02.025854 systemd-timesyncd[1544]: Contacted time server 51.81.209.232:123 (0.flatcar.pool.ntp.org). Oct 30 00:22:02.025898 systemd-resolved[1508]: Clock change detected. Flushing caches. Oct 30 00:22:02.025969 systemd-timesyncd[1544]: Initial clock synchronization to Thu 2025-10-30 00:22:02.025805 UTC. Oct 30 00:22:02.027538 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Oct 30 00:22:02.029107 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Oct 30 00:22:02.029862 jq[1596]: false Oct 30 00:22:02.031162 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Oct 30 00:22:02.033948 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Oct 30 00:22:02.036251 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Refreshing passwd entry cache Oct 30 00:22:02.036424 oslogin_cache_refresh[1598]: Refreshing passwd entry cache Oct 30 00:22:02.037251 systemd[1]: Starting systemd-logind.service - User Login Management... Oct 30 00:22:02.037825 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Oct 30 00:22:02.040941 extend-filesystems[1597]: Found /dev/sda6 Oct 30 00:22:02.041468 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Oct 30 00:22:02.042070 systemd[1]: Starting update-engine.service - Update Engine... Oct 30 00:22:02.043250 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Failure getting users, quitting Oct 30 00:22:02.043250 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 30 00:22:02.043250 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Refreshing group entry cache Oct 30 00:22:02.042990 oslogin_cache_refresh[1598]: Failure getting users, quitting Oct 30 00:22:02.043000 oslogin_cache_refresh[1598]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 30 00:22:02.043026 oslogin_cache_refresh[1598]: Refreshing group entry cache Oct 30 00:22:02.044164 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Oct 30 00:22:02.048105 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... 
Oct 30 00:22:02.048126 oslogin_cache_refresh[1598]: Failure getting groups, quitting Oct 30 00:22:02.048606 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Failure getting groups, quitting Oct 30 00:22:02.048606 google_oslogin_nss_cache[1598]: oslogin_cache_refresh[1598]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 30 00:22:02.048132 oslogin_cache_refresh[1598]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 30 00:22:02.051153 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Oct 30 00:22:02.051386 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Oct 30 00:22:02.052055 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Oct 30 00:22:02.052210 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Oct 30 00:22:02.052327 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Oct 30 00:22:02.054582 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Oct 30 00:22:02.054713 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Oct 30 00:22:02.059900 extend-filesystems[1597]: Found /dev/sda9 Oct 30 00:22:02.066544 extend-filesystems[1597]: Checking size of /dev/sda9 Oct 30 00:22:02.069691 jq[1609]: true Oct 30 00:22:02.077376 systemd[1]: motdgen.service: Deactivated successfully. Oct 30 00:22:02.080694 update_engine[1608]: I20251030 00:22:02.080651 1608 main.cc:92] Flatcar Update Engine starting Oct 30 00:22:02.081179 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Oct 30 00:22:02.081242 (ntainerd)[1627]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Oct 30 00:22:02.086623 extend-filesystems[1597]: Old size kept for /dev/sda9 Oct 30 00:22:02.086963 tar[1617]: linux-amd64/LICENSE Oct 30 00:22:02.086963 tar[1617]: linux-amd64/helm Oct 30 00:22:02.087414 systemd[1]: extend-filesystems.service: Deactivated successfully. Oct 30 00:22:02.087553 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Oct 30 00:22:02.096537 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Oct 30 00:22:02.099973 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Oct 30 00:22:02.104268 jq[1633]: true Oct 30 00:22:02.129680 dbus-daemon[1594]: [system] SELinux support is enabled Oct 30 00:22:02.129788 systemd[1]: Started dbus.service - D-Bus System Message Bus. Oct 30 00:22:02.131466 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Oct 30 00:22:02.131494 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Oct 30 00:22:02.131624 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Oct 30 00:22:02.131637 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Oct 30 00:22:02.145913 systemd[1]: Started update-engine.service - Update Engine. 
Oct 30 00:22:02.150143 update_engine[1608]: I20251030 00:22:02.149422 1608 update_check_scheduler.cc:74] Next update check in 7m51s Oct 30 00:22:02.152753 systemd[1]: Started locksmithd.service - Cluster reboot manager. Oct 30 00:22:02.168117 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Oct 30 00:22:02.181503 systemd-logind[1606]: Watching system buttons on /dev/input/event2 (Power Button) Oct 30 00:22:02.182195 unknown[1642]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Oct 30 00:22:02.182664 bash[1661]: Updated "/home/core/.ssh/authorized_keys" Oct 30 00:22:02.183483 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Oct 30 00:22:02.183929 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Oct 30 00:22:02.184047 systemd-logind[1606]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Oct 30 00:22:02.184626 systemd-logind[1606]: New seat seat0. Oct 30 00:22:02.184672 unknown[1642]: Core dump limit set to -1 Oct 30 00:22:02.187807 systemd[1]: Started systemd-logind.service - User Login Management. Oct 30 00:22:02.302547 sshd_keygen[1634]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Oct 30 00:22:02.343451 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Oct 30 00:22:02.346090 systemd[1]: Starting issuegen.service - Generate /run/issue... Oct 30 00:22:02.355766 locksmithd[1657]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Oct 30 00:22:02.367412 systemd[1]: issuegen.service: Deactivated successfully. Oct 30 00:22:02.368274 systemd[1]: Finished issuegen.service - Generate /run/issue. Oct 30 00:22:02.371641 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Oct 30 00:22:02.392299 containerd[1627]: time="2025-10-30T00:22:02Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Oct 30 00:22:02.393043 containerd[1627]: time="2025-10-30T00:22:02.392825974Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Oct 30 00:22:02.396358 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Oct 30 00:22:02.398202 systemd[1]: Started getty@tty1.service - Getty on tty1. Oct 30 00:22:02.400605 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Oct 30 00:22:02.401220 systemd[1]: Reached target getty.target - Login Prompts. 
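[Editor's note] update_engine schedules its first check and locksmithd starts with strategy="reboot", meaning the node may reboot on its own once an update has been applied. On Flatcar this strategy is normally set through the update configuration file; a sketch of the relevant keys, assuming the stock /etc/flatcar/update.conf mechanism, is:

    # /etc/flatcar/update.conf (illustrative)
    GROUP=stable
    REBOOT_STRATEGY=reboot   # other common values: etcd-lock, best-effort, off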
Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403364661Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.917µs" Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403389465Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403403482Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403494435Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403524536Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403546415Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403600290Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403610083Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403742109Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403750831Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403760563Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404323 containerd[1627]: time="2025-10-30T00:22:02.403767416Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404494 containerd[1627]: time="2025-10-30T00:22:02.403808655Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404494 containerd[1627]: time="2025-10-30T00:22:02.403917736Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404494 containerd[1627]: time="2025-10-30T00:22:02.403937925Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 30 00:22:02.404494 containerd[1627]: time="2025-10-30T00:22:02.403944554Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Oct 30 00:22:02.404494 containerd[1627]: time="2025-10-30T00:22:02.403958729Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Oct 30 00:22:02.407260 containerd[1627]: 
time="2025-10-30T00:22:02.407246967Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Oct 30 00:22:02.407970 containerd[1627]: time="2025-10-30T00:22:02.407956993Z" level=info msg="metadata content store policy set" policy=shared Oct 30 00:22:02.409975 containerd[1627]: time="2025-10-30T00:22:02.409955326Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Oct 30 00:22:02.410008 containerd[1627]: time="2025-10-30T00:22:02.409983245Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Oct 30 00:22:02.410008 containerd[1627]: time="2025-10-30T00:22:02.409992368Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Oct 30 00:22:02.410008 containerd[1627]: time="2025-10-30T00:22:02.410002722Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410009723Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410015356Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410021475Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410027547Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410041826Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410047801Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410053018Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Oct 30 00:22:02.410085 containerd[1627]: time="2025-10-30T00:22:02.410059689Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410115800Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410131783Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410141826Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410149975Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410155466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410161848Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: 
time="2025-10-30T00:22:02.410170147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410175645Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410182009Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410188184Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Oct 30 00:22:02.410210 containerd[1627]: time="2025-10-30T00:22:02.410193503Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Oct 30 00:22:02.410399 containerd[1627]: time="2025-10-30T00:22:02.410228474Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Oct 30 00:22:02.410399 containerd[1627]: time="2025-10-30T00:22:02.410237233Z" level=info msg="Start snapshots syncer" Oct 30 00:22:02.410399 containerd[1627]: time="2025-10-30T00:22:02.410249008Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Oct 30 00:22:02.410436 containerd[1627]: time="2025-10-30T00:22:02.410380995Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Oct 30 00:22:02.410436 containerd[1627]: time="2025-10-30T00:22:02.410419012Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410458643Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 
Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410525384Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410537421Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410563377Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410570368Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410576452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410582297Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Oct 30 00:22:02.410591 containerd[1627]: time="2025-10-30T00:22:02.410587650Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410599472Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410605550Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410611087Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410629126Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410637544Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410642077Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410646843Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410651013Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410655397Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410660731Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410669787Z" level=info msg="runtime interface created" Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410672591Z" level=info msg="created NRI interface" Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410677147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410683151Z" level=info msg="Connect containerd service" Oct 30 00:22:02.410760 containerd[1627]: time="2025-10-30T00:22:02.410696571Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Oct 30 00:22:02.412673 containerd[1627]: time="2025-10-30T00:22:02.412657476Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 30 00:22:02.491099 containerd[1627]: time="2025-10-30T00:22:02.491053486Z" level=info msg="Start subscribing containerd event" Oct 30 00:22:02.491176 containerd[1627]: time="2025-10-30T00:22:02.491145231Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Oct 30 00:22:02.491176 containerd[1627]: time="2025-10-30T00:22:02.491173450Z" level=info msg=serving... address=/run/containerd/containerd.sock Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491227916Z" level=info msg="Start recovering state" Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491300795Z" level=info msg="Start event monitor" Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491313064Z" level=info msg="Start cni network conf syncer for default" Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491317385Z" level=info msg="Start streaming server" Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491323656Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491327956Z" level=info msg="runtime interface starting up..." Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491330846Z" level=info msg="starting plugins..." Oct 30 00:22:02.491376 containerd[1627]: time="2025-10-30T00:22:02.491345304Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Oct 30 00:22:02.491654 containerd[1627]: time="2025-10-30T00:22:02.491587200Z" level=info msg="containerd successfully booted in 0.099578s" Oct 30 00:22:02.491648 systemd[1]: Started containerd.service - containerd container runtime. Oct 30 00:22:02.520384 tar[1617]: linux-amd64/README.md Oct 30 00:22:02.543369 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Oct 30 00:22:03.060189 systemd-networkd[1507]: ens192: Gained IPv6LL Oct 30 00:22:03.061352 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Oct 30 00:22:03.062036 systemd[1]: Reached target network-online.target - Network is Online. Oct 30 00:22:03.063090 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Oct 30 00:22:03.065319 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:03.069181 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Oct 30 00:22:03.101774 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Oct 30 00:22:03.102172 systemd[1]: coreos-metadata.service: Deactivated successfully. Oct 30 00:22:03.102753 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Oct 30 00:22:03.103413 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Oct 30 00:22:04.062654 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
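[Editor's note] The one error in the containerd startup above, "no network config found in /etc/cni/net.d", is expected on a node that has not yet joined a cluster: the CRI plugin looks for a CNI conflist in the confDir shown in its config dump, and a pod-network add-on normally installs one later. A minimal hand-written conflist of the kind it would accept, placed for example at /etc/cni/net.d/10-mynet.conflist, looks like this (file name and subnet are illustrative):

    {
      "cniVersion": "1.0.0",
      "name": "mynet",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" }
        },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }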
Oct 30 00:22:04.063096 systemd[1]: Reached target multi-user.target - Multi-User System. Oct 30 00:22:04.063767 systemd[1]: Startup finished in 2.655s (kernel) + 5.554s (initrd) + 4.121s (userspace) = 12.331s. Oct 30 00:22:04.074322 (kubelet)[1791]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 30 00:22:04.117134 login[1719]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 30 00:22:04.118015 login[1720]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 30 00:22:04.122793 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 30 00:22:04.124040 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 30 00:22:04.129662 systemd-logind[1606]: New session 2 of user core. Oct 30 00:22:04.132925 systemd-logind[1606]: New session 1 of user core. Oct 30 00:22:04.139865 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Oct 30 00:22:04.143169 systemd[1]: Starting user@500.service - User Manager for UID 500... Oct 30 00:22:04.149373 (systemd)[1798]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 30 00:22:04.151004 systemd-logind[1606]: New session c1 of user core. Oct 30 00:22:04.237419 systemd[1798]: Queued start job for default target default.target. Oct 30 00:22:04.248808 systemd[1798]: Created slice app.slice - User Application Slice. Oct 30 00:22:04.248824 systemd[1798]: Reached target paths.target - Paths. Oct 30 00:22:04.248847 systemd[1798]: Reached target timers.target - Timers. Oct 30 00:22:04.252082 systemd[1798]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 30 00:22:04.256855 systemd[1798]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 30 00:22:04.256942 systemd[1798]: Reached target sockets.target - Sockets. Oct 30 00:22:04.257008 systemd[1798]: Reached target basic.target - Basic System. Oct 30 00:22:04.257041 systemd[1798]: Reached target default.target - Main User Target. Oct 30 00:22:04.257058 systemd[1798]: Startup finished in 101ms. Oct 30 00:22:04.257236 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 30 00:22:04.258247 systemd[1]: Started session-1.scope - Session 1 of User core. Oct 30 00:22:04.262072 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 30 00:22:04.637233 kubelet[1791]: E1030 00:22:04.637199 1791 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 30 00:22:04.638902 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 30 00:22:04.639010 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 30 00:22:04.639228 systemd[1]: kubelet.service: Consumed 647ms CPU time, 266.2M memory peak. Oct 30 00:22:14.806957 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Oct 30 00:22:14.808320 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:15.187749 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
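[Editor's note] The kubelet exit above is the expected failure mode for a node that has not yet run kubeadm: /var/lib/kubelet/config.yaml is written by kubeadm init or kubeadm join, so until then the unit is simply restarted by its restart policy (the "Scheduled restart job" entries that follow, roughly every ten seconds). For reference, the file kubeadm generates is a KubeletConfiguration object; a minimal hand-written equivalent would start like this (sketch only):

    # /var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd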
Oct 30 00:22:15.190612 (kubelet)[1841]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 30 00:22:15.268327 kubelet[1841]: E1030 00:22:15.268290 1841 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 30 00:22:15.270925 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 30 00:22:15.271085 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 30 00:22:15.271346 systemd[1]: kubelet.service: Consumed 103ms CPU time, 111M memory peak. Oct 30 00:22:25.307023 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Oct 30 00:22:25.308150 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:25.519997 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:22:25.527255 (kubelet)[1856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 30 00:22:25.550434 kubelet[1856]: E1030 00:22:25.550401 1856 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 30 00:22:25.552060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 30 00:22:25.552144 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 30 00:22:25.552360 systemd[1]: kubelet.service: Consumed 88ms CPU time, 109.7M memory peak. Oct 30 00:22:32.321137 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 30 00:22:32.322262 systemd[1]: Started sshd@0-139.178.70.100:22-139.178.89.65:57724.service - OpenSSH per-connection server daemon (139.178.89.65:57724). Oct 30 00:22:32.407340 sshd[1864]: Accepted publickey for core from 139.178.89.65 port 57724 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:32.408210 sshd-session[1864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:32.411242 systemd-logind[1606]: New session 3 of user core. Oct 30 00:22:32.422252 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 30 00:22:32.476190 systemd[1]: Started sshd@1-139.178.70.100:22-139.178.89.65:57728.service - OpenSSH per-connection server daemon (139.178.89.65:57728). Oct 30 00:22:32.519923 sshd[1870]: Accepted publickey for core from 139.178.89.65 port 57728 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:32.521090 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:32.524682 systemd-logind[1606]: New session 4 of user core. Oct 30 00:22:32.537178 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 30 00:22:32.587138 sshd[1873]: Connection closed by 139.178.89.65 port 57728 Oct 30 00:22:32.588257 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Oct 30 00:22:32.593297 systemd[1]: sshd@1-139.178.70.100:22-139.178.89.65:57728.service: Deactivated successfully. 
Oct 30 00:22:32.594760 systemd[1]: session-4.scope: Deactivated successfully. Oct 30 00:22:32.595451 systemd-logind[1606]: Session 4 logged out. Waiting for processes to exit. Oct 30 00:22:32.596563 systemd-logind[1606]: Removed session 4. Oct 30 00:22:32.597423 systemd[1]: Started sshd@2-139.178.70.100:22-139.178.89.65:57730.service - OpenSSH per-connection server daemon (139.178.89.65:57730). Oct 30 00:22:32.643205 sshd[1879]: Accepted publickey for core from 139.178.89.65 port 57730 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:32.644070 sshd-session[1879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:32.646769 systemd-logind[1606]: New session 5 of user core. Oct 30 00:22:32.654192 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 30 00:22:32.700685 sshd[1882]: Connection closed by 139.178.89.65 port 57730 Oct 30 00:22:32.700854 sshd-session[1879]: pam_unix(sshd:session): session closed for user core Oct 30 00:22:32.710408 systemd[1]: sshd@2-139.178.70.100:22-139.178.89.65:57730.service: Deactivated successfully. Oct 30 00:22:32.711469 systemd[1]: session-5.scope: Deactivated successfully. Oct 30 00:22:32.712345 systemd-logind[1606]: Session 5 logged out. Waiting for processes to exit. Oct 30 00:22:32.713807 systemd[1]: Started sshd@3-139.178.70.100:22-139.178.89.65:57734.service - OpenSSH per-connection server daemon (139.178.89.65:57734). Oct 30 00:22:32.714449 systemd-logind[1606]: Removed session 5. Oct 30 00:22:32.757129 sshd[1888]: Accepted publickey for core from 139.178.89.65 port 57734 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:32.758082 sshd-session[1888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:32.761326 systemd-logind[1606]: New session 6 of user core. Oct 30 00:22:32.772217 systemd[1]: Started session-6.scope - Session 6 of User core. Oct 30 00:22:32.821419 sshd[1891]: Connection closed by 139.178.89.65 port 57734 Oct 30 00:22:32.821799 sshd-session[1888]: pam_unix(sshd:session): session closed for user core Oct 30 00:22:32.827775 systemd[1]: sshd@3-139.178.70.100:22-139.178.89.65:57734.service: Deactivated successfully. Oct 30 00:22:32.829185 systemd[1]: session-6.scope: Deactivated successfully. Oct 30 00:22:32.829756 systemd-logind[1606]: Session 6 logged out. Waiting for processes to exit. Oct 30 00:22:32.832283 systemd[1]: Started sshd@4-139.178.70.100:22-139.178.89.65:57750.service - OpenSSH per-connection server daemon (139.178.89.65:57750). Oct 30 00:22:32.833138 systemd-logind[1606]: Removed session 6. Oct 30 00:22:32.882403 sshd[1897]: Accepted publickey for core from 139.178.89.65 port 57750 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:32.883174 sshd-session[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:32.886067 systemd-logind[1606]: New session 7 of user core. Oct 30 00:22:32.892181 systemd[1]: Started session-7.scope - Session 7 of User core. 
Oct 30 00:22:32.975499 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 30 00:22:32.975668 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 30 00:22:32.989547 sudo[1901]: pam_unix(sudo:session): session closed for user root Oct 30 00:22:32.990603 sshd[1900]: Connection closed by 139.178.89.65 port 57750 Oct 30 00:22:32.991592 sshd-session[1897]: pam_unix(sshd:session): session closed for user core Oct 30 00:22:33.002439 systemd[1]: sshd@4-139.178.70.100:22-139.178.89.65:57750.service: Deactivated successfully. Oct 30 00:22:33.003763 systemd[1]: session-7.scope: Deactivated successfully. Oct 30 00:22:33.004511 systemd-logind[1606]: Session 7 logged out. Waiting for processes to exit. Oct 30 00:22:33.006876 systemd[1]: Started sshd@5-139.178.70.100:22-139.178.89.65:57756.service - OpenSSH per-connection server daemon (139.178.89.65:57756). Oct 30 00:22:33.007649 systemd-logind[1606]: Removed session 7. Oct 30 00:22:33.051714 sshd[1907]: Accepted publickey for core from 139.178.89.65 port 57756 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:33.052881 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:33.056245 systemd-logind[1606]: New session 8 of user core. Oct 30 00:22:33.063396 systemd[1]: Started session-8.scope - Session 8 of User core. Oct 30 00:22:33.112772 sudo[1912]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 30 00:22:33.113169 sudo[1912]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 30 00:22:33.118766 sudo[1912]: pam_unix(sudo:session): session closed for user root Oct 30 00:22:33.121924 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 30 00:22:33.122244 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 30 00:22:33.128414 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 30 00:22:33.154330 augenrules[1934]: No rules Oct 30 00:22:33.155377 systemd[1]: audit-rules.service: Deactivated successfully. Oct 30 00:22:33.155629 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 30 00:22:33.156977 sudo[1911]: pam_unix(sudo:session): session closed for user root Oct 30 00:22:33.157867 sshd[1910]: Connection closed by 139.178.89.65 port 57756 Oct 30 00:22:33.159092 sshd-session[1907]: pam_unix(sshd:session): session closed for user core Oct 30 00:22:33.164983 systemd[1]: sshd@5-139.178.70.100:22-139.178.89.65:57756.service: Deactivated successfully. Oct 30 00:22:33.166484 systemd[1]: session-8.scope: Deactivated successfully. Oct 30 00:22:33.167812 systemd-logind[1606]: Session 8 logged out. Waiting for processes to exit. Oct 30 00:22:33.169235 systemd[1]: Started sshd@6-139.178.70.100:22-139.178.89.65:57770.service - OpenSSH per-connection server daemon (139.178.89.65:57770). Oct 30 00:22:33.170271 systemd-logind[1606]: Removed session 8. Oct 30 00:22:33.218936 sshd[1943]: Accepted publickey for core from 139.178.89.65 port 57770 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:22:33.219689 sshd-session[1943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:22:33.222202 systemd-logind[1606]: New session 9 of user core. Oct 30 00:22:33.232392 systemd[1]: Started session-9.scope - Session 9 of User core. 
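[Editor's note] The sudo commands above delete the two shipped audit rule files and restart audit-rules.service, which is why augenrules then reports "No rules": augenrules concatenates /etc/audit/rules.d/*.rules, and that directory is now empty. Any file dropped back into it is picked up on the next restart; an illustrative one (not one of the files removed above):

    # /etc/audit/rules.d/99-example.rules
    -w /etc/passwd -p wa -k identity
    -a always,exit -F arch=b64 -S execve -k exec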
Oct 30 00:22:33.280480 sudo[1947]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 30 00:22:33.280640 sudo[1947]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 30 00:22:33.699050 systemd[1]: Starting docker.service - Docker Application Container Engine... Oct 30 00:22:33.709414 (dockerd)[1964]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 30 00:22:34.017102 dockerd[1964]: time="2025-10-30T00:22:34.016787402Z" level=info msg="Starting up" Oct 30 00:22:34.017643 dockerd[1964]: time="2025-10-30T00:22:34.017608610Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Oct 30 00:22:34.024227 dockerd[1964]: time="2025-10-30T00:22:34.024195662Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Oct 30 00:22:34.064321 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport513024203-merged.mount: Deactivated successfully. Oct 30 00:22:34.112336 dockerd[1964]: time="2025-10-30T00:22:34.112299340Z" level=info msg="Loading containers: start." Oct 30 00:22:34.125052 kernel: Initializing XFRM netlink socket Oct 30 00:22:34.539792 systemd-networkd[1507]: docker0: Link UP Oct 30 00:22:34.612284 dockerd[1964]: time="2025-10-30T00:22:34.612237999Z" level=info msg="Loading containers: done." Oct 30 00:22:34.683792 dockerd[1964]: time="2025-10-30T00:22:34.683750324Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 30 00:22:34.683917 dockerd[1964]: time="2025-10-30T00:22:34.683828272Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Oct 30 00:22:34.683917 dockerd[1964]: time="2025-10-30T00:22:34.683902494Z" level=info msg="Initializing buildkit" Oct 30 00:22:34.776707 dockerd[1964]: time="2025-10-30T00:22:34.776624869Z" level=info msg="Completed buildkit initialization" Oct 30 00:22:34.784004 dockerd[1964]: time="2025-10-30T00:22:34.783952042Z" level=info msg="Daemon has completed initialization" Oct 30 00:22:34.785620 dockerd[1964]: time="2025-10-30T00:22:34.784106787Z" level=info msg="API listen on /run/docker.sock" Oct 30 00:22:34.785109 systemd[1]: Started docker.service - Docker Application Container Engine. Oct 30 00:22:35.557074 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Oct 30 00:22:35.558747 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:36.322124 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
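[Editor's note] Once dockerd reports "API listen on /run/docker.sock", the daemon can be reached over that UNIX socket; a quick smoke test from the host would be:

    # ping the daemon directly over the socket named in the log
    curl --unix-socket /run/docker.sock http://localhost/_ping   # prints OK
    docker version                                               # same check via the CLI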
Oct 30 00:22:36.332346 (kubelet)[2182]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 30 00:22:36.364186 kubelet[2182]: E1030 00:22:36.364151 2182 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 30 00:22:36.365976 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 30 00:22:36.366161 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 30 00:22:36.366515 systemd[1]: kubelet.service: Consumed 113ms CPU time, 107.6M memory peak. Oct 30 00:22:36.415474 containerd[1627]: time="2025-10-30T00:22:36.415447776Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Oct 30 00:22:37.048523 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2084615467.mount: Deactivated successfully. Oct 30 00:22:38.138576 containerd[1627]: time="2025-10-30T00:22:38.138540649Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:38.142957 containerd[1627]: time="2025-10-30T00:22:38.142935286Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114893" Oct 30 00:22:38.147632 containerd[1627]: time="2025-10-30T00:22:38.147610888Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:38.156246 containerd[1627]: time="2025-10-30T00:22:38.156204672Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:38.156700 containerd[1627]: time="2025-10-30T00:22:38.156582574Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 1.741110333s" Oct 30 00:22:38.156700 containerd[1627]: time="2025-10-30T00:22:38.156603335Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\"" Oct 30 00:22:38.156962 containerd[1627]: time="2025-10-30T00:22:38.156951122Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Oct 30 00:22:40.427619 containerd[1627]: time="2025-10-30T00:22:40.427104568Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:40.431619 containerd[1627]: time="2025-10-30T00:22:40.431601724Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020844" Oct 30 00:22:40.435249 containerd[1627]: time="2025-10-30T00:22:40.435234155Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:40.441753 containerd[1627]: time="2025-10-30T00:22:40.441729237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:40.442442 containerd[1627]: time="2025-10-30T00:22:40.442418449Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 2.285415116s" Oct 30 00:22:40.442507 containerd[1627]: time="2025-10-30T00:22:40.442491723Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\"" Oct 30 00:22:40.442909 containerd[1627]: time="2025-10-30T00:22:40.442869719Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Oct 30 00:22:41.856710 containerd[1627]: time="2025-10-30T00:22:41.856072384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:41.863850 containerd[1627]: time="2025-10-30T00:22:41.863829163Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155568" Oct 30 00:22:41.871745 containerd[1627]: time="2025-10-30T00:22:41.871728557Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:41.881272 containerd[1627]: time="2025-10-30T00:22:41.881256192Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:41.881905 containerd[1627]: time="2025-10-30T00:22:41.881881131Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.438911261s" Oct 30 00:22:41.881949 containerd[1627]: time="2025-10-30T00:22:41.881906243Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\"" Oct 30 00:22:41.882551 containerd[1627]: time="2025-10-30T00:22:41.882525439Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Oct 30 00:22:43.123695 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4149699719.mount: Deactivated successfully. 
Oct 30 00:22:43.564213 containerd[1627]: time="2025-10-30T00:22:43.564141319Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:43.569064 containerd[1627]: time="2025-10-30T00:22:43.569049010Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929469" Oct 30 00:22:43.574086 containerd[1627]: time="2025-10-30T00:22:43.574068657Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:43.582528 containerd[1627]: time="2025-10-30T00:22:43.582506964Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:43.582934 containerd[1627]: time="2025-10-30T00:22:43.582705874Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.699760674s" Oct 30 00:22:43.582934 containerd[1627]: time="2025-10-30T00:22:43.582721880Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\"" Oct 30 00:22:43.583012 containerd[1627]: time="2025-10-30T00:22:43.583004908Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Oct 30 00:22:44.314787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3597007472.mount: Deactivated successfully. 
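The containerd pull entries above report both the bytes read and the wall-clock pull time, so an approximate transfer rate can be derived directly from them. The following is a minimal sketch (not part of the log) that reproduces that arithmetic; the sample values are copied from the kube-proxy:v1.33.5 entries, and any of the other pulls would work the same way.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the kube-proxy:v1.33.5 pull entries above.
	bytesRead := 31929469                          // "bytes read=31929469"
	dur, err := time.ParseDuration("1.699760674s") // "in 1.699760674s"
	if err != nil {
		panic(err)
	}
	mib := float64(bytesRead) / (1024 * 1024)
	fmt.Printf("pulled %.1f MiB in %s (~%.1f MiB/s)\n", mib, dur, mib/dur.Seconds())
}
```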
Oct 30 00:22:45.484240 containerd[1627]: time="2025-10-30T00:22:45.484205411Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:45.485307 containerd[1627]: time="2025-10-30T00:22:45.485177361Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Oct 30 00:22:45.485451 containerd[1627]: time="2025-10-30T00:22:45.485434605Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:45.488601 containerd[1627]: time="2025-10-30T00:22:45.488473851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:45.488963 containerd[1627]: time="2025-10-30T00:22:45.488759132Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.905741041s" Oct 30 00:22:45.488963 containerd[1627]: time="2025-10-30T00:22:45.488775581Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Oct 30 00:22:45.489184 containerd[1627]: time="2025-10-30T00:22:45.489167275Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Oct 30 00:22:45.935171 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1242144357.mount: Deactivated successfully. 
Oct 30 00:22:45.937534 containerd[1627]: time="2025-10-30T00:22:45.937076940Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 30 00:22:45.937534 containerd[1627]: time="2025-10-30T00:22:45.937455867Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Oct 30 00:22:45.937534 containerd[1627]: time="2025-10-30T00:22:45.937511117Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 30 00:22:45.938633 containerd[1627]: time="2025-10-30T00:22:45.938621005Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 30 00:22:45.939073 containerd[1627]: time="2025-10-30T00:22:45.939057671Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 449.835101ms" Oct 30 00:22:45.939112 containerd[1627]: time="2025-10-30T00:22:45.939073420Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Oct 30 00:22:45.939358 containerd[1627]: time="2025-10-30T00:22:45.939344450Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Oct 30 00:22:46.394476 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Oct 30 00:22:46.396363 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:46.407211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3380241674.mount: Deactivated successfully. Oct 30 00:22:46.883467 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:22:46.886111 (kubelet)[2334]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 30 00:22:46.939810 kubelet[2334]: E1030 00:22:46.939767 2334 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 30 00:22:46.941090 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 30 00:22:46.941189 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 30 00:22:46.941796 systemd[1]: kubelet.service: Consumed 104ms CPU time, 109.3M memory peak. Oct 30 00:22:47.732355 update_engine[1608]: I20251030 00:22:47.732055 1608 update_attempter.cc:509] Updating boot flags... 
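The kubelet keeps exiting with status 1 because /var/lib/kubelet/config.yaml does not exist yet, and systemd keeps rescheduling the unit (restart counter at 4 above). On a kubeadm-provisioned node that file is normally written by `kubeadm init`/`kubeadm join`; until then the crash loop seen here is expected. The sketch below is an assumption-laden illustration only: it writes a minimal KubeletConfiguration to the path named in the run.go:72 error, with field values chosen for demonstration rather than taken from this system.

```go
package main

import (
	"log"
	"os"
)

// Illustrative KubeletConfiguration; on a real node kubeadm generates this file.
const kubeletConfig = `apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd            # matches the "cgroupDriver=systemd" setting reported later in the log
staticPodPath: /etc/kubernetes/manifests
`

func main() {
	// /var/lib/kubelet/config.yaml is the path named in the failing run.go:72 error above.
	if err := os.MkdirAll("/var/lib/kubelet", 0o755); err != nil {
		log.Fatal(err)
	}
	if err := os.WriteFile("/var/lib/kubelet/config.yaml", []byte(kubeletConfig), 0o644); err != nil {
		log.Fatal(err)
	}
}
```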
Oct 30 00:22:52.101071 containerd[1627]: time="2025-10-30T00:22:52.100776091Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:52.102203 containerd[1627]: time="2025-10-30T00:22:52.102174967Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378433" Oct 30 00:22:52.102506 containerd[1627]: time="2025-10-30T00:22:52.102486986Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:52.109740 containerd[1627]: time="2025-10-30T00:22:52.108422974Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:22:52.109859 containerd[1627]: time="2025-10-30T00:22:52.109750498Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 6.170387513s" Oct 30 00:22:52.109859 containerd[1627]: time="2025-10-30T00:22:52.109784568Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Oct 30 00:22:54.505397 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:22:54.505792 systemd[1]: kubelet.service: Consumed 104ms CPU time, 109.3M memory peak. Oct 30 00:22:54.507800 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:54.529182 systemd[1]: Reload requested from client PID 2437 ('systemctl') (unit session-9.scope)... Oct 30 00:22:54.529191 systemd[1]: Reloading... Oct 30 00:22:54.616054 zram_generator::config[2486]: No configuration found. Oct 30 00:22:54.676924 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 30 00:22:54.743749 systemd[1]: Reloading finished in 214 ms. Oct 30 00:22:54.828304 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Oct 30 00:22:54.828380 systemd[1]: kubelet.service: Failed with result 'signal'. Oct 30 00:22:54.828753 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:22:54.831300 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:22:55.272818 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:22:55.279227 (kubelet)[2550]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 30 00:22:55.331044 kubelet[2550]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 30 00:22:55.331044 kubelet[2550]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Oct 30 00:22:55.331044 kubelet[2550]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 30 00:22:55.344514 kubelet[2550]: I1030 00:22:55.344466 2550 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 30 00:22:55.794027 kubelet[2550]: I1030 00:22:55.793999 2550 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Oct 30 00:22:55.794027 kubelet[2550]: I1030 00:22:55.794019 2550 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 30 00:22:55.795957 kubelet[2550]: I1030 00:22:55.794385 2550 server.go:956] "Client rotation is on, will bootstrap in background" Oct 30 00:22:55.870140 kubelet[2550]: I1030 00:22:55.870115 2550 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 30 00:22:55.873138 kubelet[2550]: E1030 00:22:55.873107 2550 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://139.178.70.100:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Oct 30 00:22:55.886262 kubelet[2550]: I1030 00:22:55.886192 2550 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 30 00:22:55.900397 kubelet[2550]: I1030 00:22:55.900368 2550 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Oct 30 00:22:55.904253 kubelet[2550]: I1030 00:22:55.904204 2550 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 30 00:22:55.907180 kubelet[2550]: I1030 00:22:55.904254 2550 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 30 00:22:55.907180 kubelet[2550]: I1030 00:22:55.907183 2550 topology_manager.go:138] "Creating topology manager with none policy" Oct 30 00:22:55.907357 kubelet[2550]: I1030 00:22:55.907198 2550 container_manager_linux.go:303] "Creating device plugin manager" Oct 30 00:22:55.910166 kubelet[2550]: I1030 00:22:55.910131 2550 state_mem.go:36] "Initialized new in-memory state store" Oct 30 00:22:55.929488 kubelet[2550]: I1030 00:22:55.929450 2550 kubelet.go:480] "Attempting to sync node with API server" Oct 30 00:22:55.929488 kubelet[2550]: I1030 00:22:55.929494 2550 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 30 00:22:55.929666 kubelet[2550]: I1030 00:22:55.929524 2550 kubelet.go:386] "Adding apiserver pod source" Oct 30 00:22:55.944959 kubelet[2550]: I1030 00:22:55.944850 2550 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 30 00:22:55.985649 kubelet[2550]: E1030 00:22:55.985513 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 30 00:22:55.985649 kubelet[2550]: E1030 00:22:55.985620 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 
30 00:22:55.987553 kubelet[2550]: I1030 00:22:55.987099 2550 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 30 00:22:55.987553 kubelet[2550]: I1030 00:22:55.987424 2550 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 30 00:22:55.988088 kubelet[2550]: W1030 00:22:55.988078 2550 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Oct 30 00:22:55.991966 kubelet[2550]: I1030 00:22:55.991948 2550 watchdog_linux.go:99] "Systemd watchdog is not enabled" Oct 30 00:22:55.992125 kubelet[2550]: I1030 00:22:55.992116 2550 server.go:1289] "Started kubelet" Oct 30 00:22:55.995118 kubelet[2550]: I1030 00:22:55.995054 2550 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 30 00:22:55.997169 kubelet[2550]: E1030 00:22:55.994922 2550 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.100:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.100:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18731d028b0de1b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-10-30 00:22:55.992078772 +0000 UTC m=+0.710391353,LastTimestamp:2025-10-30 00:22:55.992078772 +0000 UTC m=+0.710391353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Oct 30 00:22:55.998373 kubelet[2550]: I1030 00:22:55.998357 2550 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 30 00:22:56.003951 kubelet[2550]: I1030 00:22:56.003617 2550 volume_manager.go:297] "Starting Kubelet Volume Manager" Oct 30 00:22:56.003951 kubelet[2550]: I1030 00:22:56.003788 2550 server.go:317] "Adding debug handlers to kubelet server" Oct 30 00:22:56.008345 kubelet[2550]: I1030 00:22:56.007613 2550 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Oct 30 00:22:56.008345 kubelet[2550]: I1030 00:22:56.007668 2550 reconciler.go:26] "Reconciler: start to sync state" Oct 30 00:22:56.011639 kubelet[2550]: E1030 00:22:56.003788 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:56.014994 kubelet[2550]: I1030 00:22:56.014050 2550 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 30 00:22:56.014994 kubelet[2550]: I1030 00:22:56.014253 2550 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 30 00:22:56.014994 kubelet[2550]: I1030 00:22:56.014288 2550 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Oct 30 00:22:56.014994 kubelet[2550]: I1030 00:22:56.014425 2550 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 30 00:22:56.015151 kubelet[2550]: I1030 00:22:56.015026 2550 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Oct 30 00:22:56.015151 kubelet[2550]: I1030 00:22:56.015061 2550 status_manager.go:230] "Starting to sync pod status with apiserver" Oct 30 00:22:56.015151 kubelet[2550]: I1030 00:22:56.015075 2550 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 30 00:22:56.015151 kubelet[2550]: I1030 00:22:56.015083 2550 kubelet.go:2436] "Starting kubelet main sync loop" Oct 30 00:22:56.015151 kubelet[2550]: E1030 00:22:56.015108 2550 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 30 00:22:56.018122 kubelet[2550]: E1030 00:22:56.018100 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 30 00:22:56.018841 kubelet[2550]: E1030 00:22:56.018813 2550 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="200ms" Oct 30 00:22:56.020370 kubelet[2550]: I1030 00:22:56.020345 2550 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 30 00:22:56.020703 kubelet[2550]: E1030 00:22:56.020393 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 30 00:22:56.020703 kubelet[2550]: E1030 00:22:56.020477 2550 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 30 00:22:56.021401 kubelet[2550]: I1030 00:22:56.021385 2550 factory.go:223] Registration of the containerd container factory successfully Oct 30 00:22:56.021401 kubelet[2550]: I1030 00:22:56.021396 2550 factory.go:223] Registration of the systemd container factory successfully Oct 30 00:22:56.036004 kubelet[2550]: I1030 00:22:56.035981 2550 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 30 00:22:56.036004 kubelet[2550]: I1030 00:22:56.035995 2550 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 30 00:22:56.036150 kubelet[2550]: I1030 00:22:56.036066 2550 state_mem.go:36] "Initialized new in-memory state store" Oct 30 00:22:56.037292 kubelet[2550]: I1030 00:22:56.037270 2550 policy_none.go:49] "None policy: Start" Oct 30 00:22:56.037292 kubelet[2550]: I1030 00:22:56.037290 2550 memory_manager.go:186] "Starting memorymanager" policy="None" Oct 30 00:22:56.037388 kubelet[2550]: I1030 00:22:56.037298 2550 state_mem.go:35] "Initializing new in-memory state store" Oct 30 00:22:56.042545 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Oct 30 00:22:56.055412 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Oct 30 00:22:56.058600 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Oct 30 00:22:56.082063 kubelet[2550]: E1030 00:22:56.082022 2550 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 30 00:22:56.082315 kubelet[2550]: I1030 00:22:56.082175 2550 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 30 00:22:56.082315 kubelet[2550]: I1030 00:22:56.082186 2550 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 30 00:22:56.082479 kubelet[2550]: I1030 00:22:56.082467 2550 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 30 00:22:56.083541 kubelet[2550]: E1030 00:22:56.083508 2550 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Oct 30 00:22:56.083541 kubelet[2550]: E1030 00:22:56.083530 2550 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Oct 30 00:22:56.127788 systemd[1]: Created slice kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice - libcontainer container kubepods-burstable-pod20c890a246d840d308022312da9174cb.slice. Oct 30 00:22:56.159274 kubelet[2550]: E1030 00:22:56.159160 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:56.163076 systemd[1]: Created slice kubepods-burstable-podd13d96f639b65e57f439b4396b605564.slice - libcontainer container kubepods-burstable-podd13d96f639b65e57f439b4396b605564.slice. Oct 30 00:22:56.164484 kubelet[2550]: E1030 00:22:56.164404 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:56.166440 systemd[1]: Created slice kubepods-burstable-pod7489e264250a14b73066037d270c9ecb.slice - libcontainer container kubepods-burstable-pod7489e264250a14b73066037d270c9ecb.slice. 
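The "Failed to ensure lease exists, will retry" entries report a retry interval that doubles on consecutive failures: 200ms above, then 400ms, 800ms and 1.6s later in this log. A minimal sketch of that doubling pattern; the cap value is an assumption for illustration only.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	interval := 200 * time.Millisecond // first interval reported in the log
	maxInterval := 7 * time.Second     // illustrative cap, not a value from the log
	for i := 0; i < 6; i++ {
		fmt.Println("retry interval:", interval)
		interval *= 2
		if interval > maxInterval {
			interval = maxInterval
		}
	}
}
```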
Oct 30 00:22:56.167807 kubelet[2550]: E1030 00:22:56.167786 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:56.195014 kubelet[2550]: I1030 00:22:56.194994 2550 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:22:56.195221 kubelet[2550]: E1030 00:22:56.195202 2550 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost" Oct 30 00:22:56.220059 kubelet[2550]: E1030 00:22:56.220019 2550 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="400ms" Oct 30 00:22:56.309775 kubelet[2550]: I1030 00:22:56.309688 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:22:56.309775 kubelet[2550]: I1030 00:22:56.309722 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:22:56.309775 kubelet[2550]: I1030 00:22:56.309738 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:22:56.309775 kubelet[2550]: I1030 00:22:56.309751 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:22:56.309940 kubelet[2550]: I1030 00:22:56.309793 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d13d96f639b65e57f439b4396b605564-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d13d96f639b65e57f439b4396b605564\") " pod="kube-system/kube-scheduler-localhost" Oct 30 00:22:56.309940 kubelet[2550]: I1030 00:22:56.309817 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:22:56.309940 kubelet[2550]: I1030 00:22:56.309831 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:22:56.309940 kubelet[2550]: I1030 00:22:56.309848 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:22:56.309940 kubelet[2550]: I1030 00:22:56.309871 2550 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:22:56.396781 kubelet[2550]: I1030 00:22:56.396750 2550 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:22:56.397338 kubelet[2550]: E1030 00:22:56.397302 2550 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost" Oct 30 00:22:56.460886 containerd[1627]: time="2025-10-30T00:22:56.460854852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:20c890a246d840d308022312da9174cb,Namespace:kube-system,Attempt:0,}" Oct 30 00:22:56.465230 containerd[1627]: time="2025-10-30T00:22:56.465208544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d13d96f639b65e57f439b4396b605564,Namespace:kube-system,Attempt:0,}" Oct 30 00:22:56.468748 containerd[1627]: time="2025-10-30T00:22:56.468725260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7489e264250a14b73066037d270c9ecb,Namespace:kube-system,Attempt:0,}" Oct 30 00:22:56.580472 containerd[1627]: time="2025-10-30T00:22:56.580378759Z" level=info msg="connecting to shim b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf" address="unix:///run/containerd/s/2ef9c805aa6a59b333cd36119e54dc21439e2c45fbcb606dd7bfd00cddeff8d6" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:22:56.593762 containerd[1627]: time="2025-10-30T00:22:56.586979476Z" level=info msg="connecting to shim bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b" address="unix:///run/containerd/s/d02ec0cce2ea8b9bcc693885b22c98c8eebb9cdf3fdb1fc21caef655d6cf38a1" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:22:56.597142 containerd[1627]: time="2025-10-30T00:22:56.596274353Z" level=info msg="connecting to shim 16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0" address="unix:///run/containerd/s/ee0869e4db822c1fd8b67ffa80ef46d06bfec94880264d29e8292fe095b34758" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:22:56.621309 kubelet[2550]: E1030 00:22:56.621283 2550 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="800ms" Oct 30 00:22:56.678232 systemd[1]: Started 
cri-containerd-b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf.scope - libcontainer container b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf. Oct 30 00:22:56.682393 systemd[1]: Started cri-containerd-16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0.scope - libcontainer container 16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0. Oct 30 00:22:56.684180 systemd[1]: Started cri-containerd-bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b.scope - libcontainer container bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b. Oct 30 00:22:56.752942 containerd[1627]: time="2025-10-30T00:22:56.752918479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d13d96f639b65e57f439b4396b605564,Namespace:kube-system,Attempt:0,} returns sandbox id \"bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b\"" Oct 30 00:22:56.757587 containerd[1627]: time="2025-10-30T00:22:56.757558924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:7489e264250a14b73066037d270c9ecb,Namespace:kube-system,Attempt:0,} returns sandbox id \"16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0\"" Oct 30 00:22:56.761619 containerd[1627]: time="2025-10-30T00:22:56.761600533Z" level=info msg="CreateContainer within sandbox \"bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Oct 30 00:22:56.768631 containerd[1627]: time="2025-10-30T00:22:56.768401550Z" level=info msg="Container 655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:22:56.773546 containerd[1627]: time="2025-10-30T00:22:56.773445465Z" level=info msg="CreateContainer within sandbox \"16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Oct 30 00:22:56.774385 containerd[1627]: time="2025-10-30T00:22:56.774371424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:20c890a246d840d308022312da9174cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf\"" Oct 30 00:22:56.779212 containerd[1627]: time="2025-10-30T00:22:56.779185944Z" level=info msg="CreateContainer within sandbox \"b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Oct 30 00:22:56.779740 containerd[1627]: time="2025-10-30T00:22:56.779560188Z" level=info msg="Container 434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:22:56.782270 containerd[1627]: time="2025-10-30T00:22:56.782245810Z" level=info msg="CreateContainer within sandbox \"bc3b9fcc1a1986b5c50fd783569c6b2df7e01ad4e004609636bbe2e1d454f66b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2\"" Oct 30 00:22:56.782704 containerd[1627]: time="2025-10-30T00:22:56.782691732Z" level=info msg="StartContainer for \"655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2\"" Oct 30 00:22:56.783908 containerd[1627]: time="2025-10-30T00:22:56.783608444Z" level=info msg="connecting to shim 655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2" 
address="unix:///run/containerd/s/d02ec0cce2ea8b9bcc693885b22c98c8eebb9cdf3fdb1fc21caef655d6cf38a1" protocol=ttrpc version=3 Oct 30 00:22:56.786602 containerd[1627]: time="2025-10-30T00:22:56.786571510Z" level=info msg="CreateContainer within sandbox \"16b35b32436d7cd0960f0f9b60f535217b75d3cb29dd33518060367bd5f994c0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46\"" Oct 30 00:22:56.787129 containerd[1627]: time="2025-10-30T00:22:56.787111198Z" level=info msg="StartContainer for \"434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46\"" Oct 30 00:22:56.788429 containerd[1627]: time="2025-10-30T00:22:56.788385029Z" level=info msg="connecting to shim 434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46" address="unix:///run/containerd/s/ee0869e4db822c1fd8b67ffa80ef46d06bfec94880264d29e8292fe095b34758" protocol=ttrpc version=3 Oct 30 00:22:56.788881 containerd[1627]: time="2025-10-30T00:22:56.788814651Z" level=info msg="Container 82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:22:56.794056 containerd[1627]: time="2025-10-30T00:22:56.794003751Z" level=info msg="CreateContainer within sandbox \"b5ebb544c9b38f62a3bfb08c44ff67a2f38ad0d31929d6f47dbe492da01c71bf\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b\"" Oct 30 00:22:56.794396 containerd[1627]: time="2025-10-30T00:22:56.794379499Z" level=info msg="StartContainer for \"82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b\"" Oct 30 00:22:56.796050 containerd[1627]: time="2025-10-30T00:22:56.796013209Z" level=info msg="connecting to shim 82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b" address="unix:///run/containerd/s/2ef9c805aa6a59b333cd36119e54dc21439e2c45fbcb606dd7bfd00cddeff8d6" protocol=ttrpc version=3 Oct 30 00:22:56.802123 kubelet[2550]: I1030 00:22:56.802101 2550 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:22:56.802322 kubelet[2550]: E1030 00:22:56.802307 2550 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: connect: connection refused" node="localhost" Oct 30 00:22:56.804215 systemd[1]: Started cri-containerd-655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2.scope - libcontainer container 655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2. Oct 30 00:22:56.815177 systemd[1]: Started cri-containerd-434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46.scope - libcontainer container 434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46. Oct 30 00:22:56.818843 systemd[1]: Started cri-containerd-82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b.scope - libcontainer container 82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b. 
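The sandboxes and containers started above correspond to the static pod manifests under the path the kubelet was configured with earlier ("Adding static pod path" path="/etc/kubernetes/manifests"). On a kubeadm control-plane node that directory typically contains kube-apiserver.yaml, kube-controller-manager.yaml, kube-scheduler.yaml and etcd.yaml. A small sketch (not from the log) that simply lists that directory:

```go
package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	// Static pod manifest directory named in the kubelet startup entries above.
	entries, err := os.ReadDir("/etc/kubernetes/manifests")
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		fmt.Println(e.Name())
	}
}
```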
Oct 30 00:22:56.879429 containerd[1627]: time="2025-10-30T00:22:56.879346794Z" level=info msg="StartContainer for \"82316bf0b428afb1739bf0e845a2c5a1219d0a98182bdda868d90a76335fec5b\" returns successfully" Oct 30 00:22:56.882059 containerd[1627]: time="2025-10-30T00:22:56.881210727Z" level=info msg="StartContainer for \"434b3058625e04cbadf3d3dc6f5b42d072b4c43e6d6ff7348cae6e83a10f4f46\" returns successfully" Oct 30 00:22:56.893811 containerd[1627]: time="2025-10-30T00:22:56.893771268Z" level=info msg="StartContainer for \"655d909c4bba311bd9e5885dffdc36ed153f94bce0208b9cdd1c0d1aa6041ec2\" returns successfully" Oct 30 00:22:56.915191 kubelet[2550]: E1030 00:22:56.915166 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://139.178.70.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Oct 30 00:22:57.026304 kubelet[2550]: E1030 00:22:57.026287 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:57.026937 kubelet[2550]: E1030 00:22:57.026904 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:57.030046 kubelet[2550]: E1030 00:22:57.029908 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:57.063046 kubelet[2550]: E1030 00:22:57.062592 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://139.178.70.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Oct 30 00:22:57.322786 kubelet[2550]: E1030 00:22:57.322762 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://139.178.70.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Oct 30 00:22:57.341206 kubelet[2550]: E1030 00:22:57.341176 2550 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://139.178.70.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.100:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Oct 30 00:22:57.422162 kubelet[2550]: E1030 00:22:57.422136 2550 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.100:6443: connect: connection refused" interval="1.6s" Oct 30 00:22:57.604394 kubelet[2550]: I1030 00:22:57.604333 2550 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:22:57.604538 kubelet[2550]: E1030 00:22:57.604524 2550 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.70.100:6443/api/v1/nodes\": dial tcp 139.178.70.100:6443: 
connect: connection refused" node="localhost" Oct 30 00:22:58.031354 kubelet[2550]: E1030 00:22:58.031202 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:58.031354 kubelet[2550]: E1030 00:22:58.031235 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:58.031666 kubelet[2550]: E1030 00:22:58.031590 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:58.975478 kubelet[2550]: E1030 00:22:58.975440 2550 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Oct 30 00:22:59.025455 kubelet[2550]: E1030 00:22:59.025425 2550 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Oct 30 00:22:59.033273 kubelet[2550]: E1030 00:22:59.033162 2550 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Oct 30 00:22:59.205927 kubelet[2550]: I1030 00:22:59.205899 2550 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:22:59.249101 kubelet[2550]: I1030 00:22:59.248912 2550 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 30 00:22:59.249101 kubelet[2550]: E1030 00:22:59.248938 2550 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Oct 30 00:22:59.256230 kubelet[2550]: E1030 00:22:59.256207 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.357172 kubelet[2550]: E1030 00:22:59.357146 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.457289 kubelet[2550]: E1030 00:22:59.457260 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.558025 kubelet[2550]: E1030 00:22:59.558006 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.658367 kubelet[2550]: E1030 00:22:59.658341 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.758827 kubelet[2550]: E1030 00:22:59.758801 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.859617 kubelet[2550]: E1030 00:22:59.859550 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:22:59.960278 kubelet[2550]: E1030 00:22:59.960256 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:23:00.060903 kubelet[2550]: E1030 00:23:00.060886 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:23:00.161588 kubelet[2550]: E1030 00:23:00.161523 2550 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 
00:23:00.206304 kubelet[2550]: I1030 00:23:00.206143 2550 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:00.212149 kubelet[2550]: I1030 00:23:00.212083 2550 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:00.213963 kubelet[2550]: I1030 00:23:00.213932 2550 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:00.499891 systemd[1]: Reload requested from client PID 2825 ('systemctl') (unit session-9.scope)... Oct 30 00:23:00.500173 systemd[1]: Reloading... Oct 30 00:23:00.544066 zram_generator::config[2868]: No configuration found. Oct 30 00:23:00.629578 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Oct 30 00:23:00.705901 systemd[1]: Reloading finished in 205 ms. Oct 30 00:23:00.739164 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:23:00.752215 systemd[1]: kubelet.service: Deactivated successfully. Oct 30 00:23:00.752374 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:23:00.752404 systemd[1]: kubelet.service: Consumed 717ms CPU time, 130.1M memory peak. Oct 30 00:23:00.753838 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 30 00:23:00.960496 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 30 00:23:00.965396 (kubelet)[2936]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 30 00:23:01.070729 kubelet[2936]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 30 00:23:01.070729 kubelet[2936]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 30 00:23:01.070729 kubelet[2936]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
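The "Creating a mirror pod for static pod" entries above are the kubelet posting read-only API copies of the manifests it runs from disk, so the control-plane pods become visible to the API server once it is reachable. Mirror pods carry the "kubernetes.io/config.mirror" annotation; the sketch below illustrates that convention and is not code from the kubelet itself (the annotation values shown are placeholders).

```go
package main

import "fmt"

// isMirrorPod reports whether a pod's annotations mark it as a kubelet mirror pod.
func isMirrorPod(annotations map[string]string) bool {
	_, ok := annotations["kubernetes.io/config.mirror"]
	return ok
}

func main() {
	mirror := map[string]string{
		"kubernetes.io/config.mirror": "7489e264250a14b73066037d270c9ecb", // hash-style value, illustrative only
		"kubernetes.io/config.source": "file",
	}
	fmt.Println(isMirrorPod(mirror))              // true
	fmt.Println(isMirrorPod(map[string]string{})) // false
}
```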
Oct 30 00:23:01.071037 kubelet[2936]: I1030 00:23:01.070966 2936 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 30 00:23:01.075485 kubelet[2936]: I1030 00:23:01.075467 2936 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Oct 30 00:23:01.075984 kubelet[2936]: I1030 00:23:01.075540 2936 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 30 00:23:01.075984 kubelet[2936]: I1030 00:23:01.075732 2936 server.go:956] "Client rotation is on, will bootstrap in background" Oct 30 00:23:01.087429 kubelet[2936]: I1030 00:23:01.087411 2936 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Oct 30 00:23:01.102996 kubelet[2936]: I1030 00:23:01.102744 2936 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 30 00:23:01.106480 kubelet[2936]: I1030 00:23:01.106468 2936 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 30 00:23:01.114700 kubelet[2936]: I1030 00:23:01.114678 2936 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Oct 30 00:23:01.114973 kubelet[2936]: I1030 00:23:01.114953 2936 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 30 00:23:01.115151 kubelet[2936]: I1030 00:23:01.115023 2936 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 30 00:23:01.115256 kubelet[2936]: I1030 00:23:01.115246 2936 topology_manager.go:138] "Creating topology manager with none policy" Oct 30 00:23:01.115303 kubelet[2936]: I1030 00:23:01.115296 2936 container_manager_linux.go:303] "Creating device plugin manager" Oct 30 00:23:01.115370 kubelet[2936]: I1030 00:23:01.115364 2936 state_mem.go:36] "Initialized new in-memory state store" Oct 30 00:23:01.115615 kubelet[2936]: I1030 
00:23:01.115564 2936 kubelet.go:480] "Attempting to sync node with API server" Oct 30 00:23:01.115615 kubelet[2936]: I1030 00:23:01.115577 2936 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 30 00:23:01.115615 kubelet[2936]: I1030 00:23:01.115594 2936 kubelet.go:386] "Adding apiserver pod source" Oct 30 00:23:01.115994 kubelet[2936]: I1030 00:23:01.115954 2936 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 30 00:23:01.120047 kubelet[2936]: I1030 00:23:01.119749 2936 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 30 00:23:01.120255 kubelet[2936]: I1030 00:23:01.120244 2936 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Oct 30 00:23:01.121852 kubelet[2936]: I1030 00:23:01.121839 2936 watchdog_linux.go:99] "Systemd watchdog is not enabled" Oct 30 00:23:01.121949 kubelet[2936]: I1030 00:23:01.121941 2936 server.go:1289] "Started kubelet" Oct 30 00:23:01.145831 kubelet[2936]: I1030 00:23:01.145723 2936 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Oct 30 00:23:01.147373 kubelet[2936]: I1030 00:23:01.147329 2936 server.go:317] "Adding debug handlers to kubelet server" Oct 30 00:23:01.148013 kubelet[2936]: I1030 00:23:01.147900 2936 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 30 00:23:01.150794 kubelet[2936]: I1030 00:23:01.150766 2936 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 30 00:23:01.151183 kubelet[2936]: I1030 00:23:01.151174 2936 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 30 00:23:01.158799 kubelet[2936]: I1030 00:23:01.158779 2936 volume_manager.go:297] "Starting Kubelet Volume Manager" Oct 30 00:23:01.158959 kubelet[2936]: E1030 00:23:01.158943 2936 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Oct 30 00:23:01.160129 kubelet[2936]: I1030 00:23:01.160083 2936 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 30 00:23:01.162224 kubelet[2936]: I1030 00:23:01.162212 2936 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Oct 30 00:23:01.162369 kubelet[2936]: I1030 00:23:01.162274 2936 reconciler.go:26] "Reconciler: start to sync state" Oct 30 00:23:01.164411 kubelet[2936]: I1030 00:23:01.164386 2936 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Oct 30 00:23:01.165768 kubelet[2936]: I1030 00:23:01.165755 2936 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Oct 30 00:23:01.165807 kubelet[2936]: I1030 00:23:01.165771 2936 status_manager.go:230] "Starting to sync pod status with apiserver" Oct 30 00:23:01.165807 kubelet[2936]: I1030 00:23:01.165784 2936 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Oct 30 00:23:01.165807 kubelet[2936]: I1030 00:23:01.165788 2936 kubelet.go:2436] "Starting kubelet main sync loop" Oct 30 00:23:01.165957 kubelet[2936]: E1030 00:23:01.165810 2936 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 30 00:23:01.166935 kubelet[2936]: I1030 00:23:01.166926 2936 factory.go:223] Registration of the systemd container factory successfully Oct 30 00:23:01.167125 kubelet[2936]: I1030 00:23:01.167114 2936 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 30 00:23:01.173199 kubelet[2936]: I1030 00:23:01.173176 2936 factory.go:223] Registration of the containerd container factory successfully Oct 30 00:23:01.214772 kubelet[2936]: I1030 00:23:01.214750 2936 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 30 00:23:01.214772 kubelet[2936]: I1030 00:23:01.214774 2936 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 30 00:23:01.214868 kubelet[2936]: I1030 00:23:01.214787 2936 state_mem.go:36] "Initialized new in-memory state store" Oct 30 00:23:01.214897 kubelet[2936]: I1030 00:23:01.214885 2936 state_mem.go:88] "Updated default CPUSet" cpuSet="" Oct 30 00:23:01.214916 kubelet[2936]: I1030 00:23:01.214894 2936 state_mem.go:96] "Updated CPUSet assignments" assignments={} Oct 30 00:23:01.214916 kubelet[2936]: I1030 00:23:01.214905 2936 policy_none.go:49] "None policy: Start" Oct 30 00:23:01.214916 kubelet[2936]: I1030 00:23:01.214910 2936 memory_manager.go:186] "Starting memorymanager" policy="None" Oct 30 00:23:01.214916 kubelet[2936]: I1030 00:23:01.214916 2936 state_mem.go:35] "Initializing new in-memory state store" Oct 30 00:23:01.215002 kubelet[2936]: I1030 00:23:01.214992 2936 state_mem.go:75] "Updated machine memory state" Oct 30 00:23:01.217517 kubelet[2936]: E1030 00:23:01.217502 2936 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Oct 30 00:23:01.217600 kubelet[2936]: I1030 00:23:01.217588 2936 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 30 00:23:01.217625 kubelet[2936]: I1030 00:23:01.217597 2936 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 30 00:23:01.219002 kubelet[2936]: I1030 00:23:01.218946 2936 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 30 00:23:01.219328 kubelet[2936]: E1030 00:23:01.219319 2936 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Oct 30 00:23:01.266714 kubelet[2936]: I1030 00:23:01.266512 2936 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:01.266714 kubelet[2936]: I1030 00:23:01.266530 2936 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.266714 kubelet[2936]: I1030 00:23:01.266705 2936 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:01.268989 kubelet[2936]: E1030 00:23:01.268971 2936 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:01.269272 kubelet[2936]: E1030 00:23:01.269261 2936 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.269529 kubelet[2936]: E1030 00:23:01.269492 2936 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:01.319792 kubelet[2936]: I1030 00:23:01.319774 2936 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Oct 30 00:23:01.329138 kubelet[2936]: I1030 00:23:01.328779 2936 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Oct 30 00:23:01.329138 kubelet[2936]: I1030 00:23:01.328835 2936 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Oct 30 00:23:01.363684 kubelet[2936]: I1030 00:23:01.363610 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:01.363684 kubelet[2936]: I1030 00:23:01.363637 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:01.363684 kubelet[2936]: I1030 00:23:01.363652 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.363684 kubelet[2936]: I1030 00:23:01.363671 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.363894 kubelet[2936]: I1030 00:23:01.363706 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7489e264250a14b73066037d270c9ecb-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: 
\"7489e264250a14b73066037d270c9ecb\") " pod="kube-system/kube-apiserver-localhost" Oct 30 00:23:01.363894 kubelet[2936]: I1030 00:23:01.363721 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.363894 kubelet[2936]: I1030 00:23:01.363733 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.363894 kubelet[2936]: I1030 00:23:01.363744 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/20c890a246d840d308022312da9174cb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"20c890a246d840d308022312da9174cb\") " pod="kube-system/kube-controller-manager-localhost" Oct 30 00:23:01.363894 kubelet[2936]: I1030 00:23:01.363765 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d13d96f639b65e57f439b4396b605564-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d13d96f639b65e57f439b4396b605564\") " pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:02.146233 kubelet[2936]: I1030 00:23:02.145505 2936 apiserver.go:52] "Watching apiserver" Oct 30 00:23:02.162522 kubelet[2936]: I1030 00:23:02.162486 2936 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Oct 30 00:23:02.202018 kubelet[2936]: I1030 00:23:02.201991 2936 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:02.208646 kubelet[2936]: E1030 00:23:02.208280 2936 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Oct 30 00:23:02.224474 kubelet[2936]: I1030 00:23:02.224441 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.224429531 podStartE2EDuration="2.224429531s" podCreationTimestamp="2025-10-30 00:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:02.217929186 +0000 UTC m=+1.189968990" watchObservedRunningTime="2025-10-30 00:23:02.224429531 +0000 UTC m=+1.196469326" Oct 30 00:23:02.224648 kubelet[2936]: I1030 00:23:02.224634 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.224630117 podStartE2EDuration="2.224630117s" podCreationTimestamp="2025-10-30 00:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:02.223822787 +0000 UTC m=+1.195862590" watchObservedRunningTime="2025-10-30 00:23:02.224630117 +0000 UTC m=+1.196669920" Oct 30 00:23:02.238418 kubelet[2936]: I1030 00:23:02.238171 2936 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.238158174 podStartE2EDuration="2.238158174s" podCreationTimestamp="2025-10-30 00:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:02.230292666 +0000 UTC m=+1.202332458" watchObservedRunningTime="2025-10-30 00:23:02.238158174 +0000 UTC m=+1.210197973" Oct 30 00:23:05.917388 kubelet[2936]: I1030 00:23:05.917363 2936 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Oct 30 00:23:05.917899 containerd[1627]: time="2025-10-30T00:23:05.917837422Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Oct 30 00:23:05.918074 kubelet[2936]: I1030 00:23:05.917942 2936 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Oct 30 00:23:06.572114 systemd[1]: Created slice kubepods-besteffort-pod38b9be9e_2e45_49b6_8751_11d98890b8cb.slice - libcontainer container kubepods-besteffort-pod38b9be9e_2e45_49b6_8751_11d98890b8cb.slice. Oct 30 00:23:06.595099 kubelet[2936]: I1030 00:23:06.594977 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/38b9be9e-2e45-49b6-8751-11d98890b8cb-lib-modules\") pod \"kube-proxy-965s2\" (UID: \"38b9be9e-2e45-49b6-8751-11d98890b8cb\") " pod="kube-system/kube-proxy-965s2" Oct 30 00:23:06.595099 kubelet[2936]: I1030 00:23:06.595017 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kb8\" (UniqueName: \"kubernetes.io/projected/38b9be9e-2e45-49b6-8751-11d98890b8cb-kube-api-access-j9kb8\") pod \"kube-proxy-965s2\" (UID: \"38b9be9e-2e45-49b6-8751-11d98890b8cb\") " pod="kube-system/kube-proxy-965s2" Oct 30 00:23:06.595099 kubelet[2936]: I1030 00:23:06.595054 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/38b9be9e-2e45-49b6-8751-11d98890b8cb-kube-proxy\") pod \"kube-proxy-965s2\" (UID: \"38b9be9e-2e45-49b6-8751-11d98890b8cb\") " pod="kube-system/kube-proxy-965s2" Oct 30 00:23:06.595099 kubelet[2936]: I1030 00:23:06.595070 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/38b9be9e-2e45-49b6-8751-11d98890b8cb-xtables-lock\") pod \"kube-proxy-965s2\" (UID: \"38b9be9e-2e45-49b6-8751-11d98890b8cb\") " pod="kube-system/kube-proxy-965s2" Oct 30 00:23:06.699906 kubelet[2936]: E1030 00:23:06.699878 2936 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Oct 30 00:23:06.699906 kubelet[2936]: E1030 00:23:06.699905 2936 projected.go:194] Error preparing data for projected volume kube-api-access-j9kb8 for pod kube-system/kube-proxy-965s2: configmap "kube-root-ca.crt" not found Oct 30 00:23:06.700867 kubelet[2936]: E1030 00:23:06.699953 2936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38b9be9e-2e45-49b6-8751-11d98890b8cb-kube-api-access-j9kb8 podName:38b9be9e-2e45-49b6-8751-11d98890b8cb nodeName:}" failed. No retries permitted until 2025-10-30 00:23:07.19993826 +0000 UTC m=+6.171978055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j9kb8" (UniqueName: "kubernetes.io/projected/38b9be9e-2e45-49b6-8751-11d98890b8cb-kube-api-access-j9kb8") pod "kube-proxy-965s2" (UID: "38b9be9e-2e45-49b6-8751-11d98890b8cb") : configmap "kube-root-ca.crt" not found Oct 30 00:23:07.086695 systemd[1]: Created slice kubepods-besteffort-podbe47e7df_ca6c_49ed_ac2e_327757fcb44f.slice - libcontainer container kubepods-besteffort-podbe47e7df_ca6c_49ed_ac2e_327757fcb44f.slice. Oct 30 00:23:07.096513 kubelet[2936]: I1030 00:23:07.096439 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5cm\" (UniqueName: \"kubernetes.io/projected/be47e7df-ca6c-49ed-ac2e-327757fcb44f-kube-api-access-6b5cm\") pod \"tigera-operator-7dcd859c48-5mzrl\" (UID: \"be47e7df-ca6c-49ed-ac2e-327757fcb44f\") " pod="tigera-operator/tigera-operator-7dcd859c48-5mzrl" Oct 30 00:23:07.096513 kubelet[2936]: I1030 00:23:07.096477 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/be47e7df-ca6c-49ed-ac2e-327757fcb44f-var-lib-calico\") pod \"tigera-operator-7dcd859c48-5mzrl\" (UID: \"be47e7df-ca6c-49ed-ac2e-327757fcb44f\") " pod="tigera-operator/tigera-operator-7dcd859c48-5mzrl" Oct 30 00:23:07.391406 containerd[1627]: time="2025-10-30T00:23:07.391022871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-5mzrl,Uid:be47e7df-ca6c-49ed-ac2e-327757fcb44f,Namespace:tigera-operator,Attempt:0,}" Oct 30 00:23:07.409084 containerd[1627]: time="2025-10-30T00:23:07.408319030Z" level=info msg="connecting to shim 4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079" address="unix:///run/containerd/s/39dccd52a16f343affe471f65d6f8513a099101700ac6e39ad8162af13dd9555" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:07.430162 systemd[1]: Started cri-containerd-4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079.scope - libcontainer container 4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079. Oct 30 00:23:07.461560 containerd[1627]: time="2025-10-30T00:23:07.461537140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-5mzrl,Uid:be47e7df-ca6c-49ed-ac2e-327757fcb44f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079\"" Oct 30 00:23:07.462816 containerd[1627]: time="2025-10-30T00:23:07.462802532Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Oct 30 00:23:07.480996 containerd[1627]: time="2025-10-30T00:23:07.480974711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-965s2,Uid:38b9be9e-2e45-49b6-8751-11d98890b8cb,Namespace:kube-system,Attempt:0,}" Oct 30 00:23:07.496242 containerd[1627]: time="2025-10-30T00:23:07.496218493Z" level=info msg="connecting to shim 67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192" address="unix:///run/containerd/s/f1e056e7b58642cf306618669b14106a06ad4a5b324f2a8abb638c303bc98532" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:07.511158 systemd[1]: Started cri-containerd-67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192.scope - libcontainer container 67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192. 
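The "Created slice kubepods-besteffort-pod…" entries show how the systemd cgroup slice for a pod is derived from its QoS class and UID; for the tigera-operator pod above, UID be47e7df-ca6c-49ed-ac2e-327757fcb44f becomes kubepods-besteffort-podbe47e7df_ca6c_49ed_ac2e_327757fcb44f.slice. A small sketch of that naming, assuming nothing beyond the string transformation visible in the log:

```go
// Sketch: rebuild the systemd slice name logged above from a pod's QoS class
// and UID. Dashes in a .slice unit name separate cgroup hierarchy levels
// (kubepods.slice -> kubepods-besteffort.slice -> ...), so the dashes inside
// the pod UID are escaped to underscores.
package main

import (
	"fmt"
	"strings"
)

func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSliceName("besteffort", "be47e7df-ca6c-49ed-ac2e-327757fcb44f"))
	// kubepods-besteffort-podbe47e7df_ca6c_49ed_ac2e_327757fcb44f.slice
}
```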
Oct 30 00:23:07.526848 containerd[1627]: time="2025-10-30T00:23:07.526828630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-965s2,Uid:38b9be9e-2e45-49b6-8751-11d98890b8cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192\"" Oct 30 00:23:07.531134 containerd[1627]: time="2025-10-30T00:23:07.531107254Z" level=info msg="CreateContainer within sandbox \"67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Oct 30 00:23:07.535733 containerd[1627]: time="2025-10-30T00:23:07.535702032Z" level=info msg="Container c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:07.538934 containerd[1627]: time="2025-10-30T00:23:07.538867460Z" level=info msg="CreateContainer within sandbox \"67abb50ec8c6b325b1b0c536bf508b9da83ac3556ce0037be0292dabb553f192\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d\"" Oct 30 00:23:07.539374 containerd[1627]: time="2025-10-30T00:23:07.539359648Z" level=info msg="StartContainer for \"c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d\"" Oct 30 00:23:07.540123 containerd[1627]: time="2025-10-30T00:23:07.540108102Z" level=info msg="connecting to shim c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d" address="unix:///run/containerd/s/f1e056e7b58642cf306618669b14106a06ad4a5b324f2a8abb638c303bc98532" protocol=ttrpc version=3 Oct 30 00:23:07.557187 systemd[1]: Started cri-containerd-c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d.scope - libcontainer container c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d. Oct 30 00:23:07.590675 containerd[1627]: time="2025-10-30T00:23:07.590653064Z" level=info msg="StartContainer for \"c9a59925d332df64cee231e19b545c2057a0612b5d10917a34d449e43314d76d\" returns successfully" Oct 30 00:23:08.223339 kubelet[2936]: I1030 00:23:08.223289 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-965s2" podStartSLOduration=2.223265414 podStartE2EDuration="2.223265414s" podCreationTimestamp="2025-10-30 00:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:08.223238542 +0000 UTC m=+7.195278337" watchObservedRunningTime="2025-10-30 00:23:08.223265414 +0000 UTC m=+7.195305206" Oct 30 00:23:09.084407 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2119230470.mount: Deactivated successfully. 
Oct 30 00:23:09.623564 containerd[1627]: time="2025-10-30T00:23:09.623518126Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:09.624019 containerd[1627]: time="2025-10-30T00:23:09.623977789Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Oct 30 00:23:09.624345 containerd[1627]: time="2025-10-30T00:23:09.624329313Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:09.625252 containerd[1627]: time="2025-10-30T00:23:09.625235676Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:09.625688 containerd[1627]: time="2025-10-30T00:23:09.625672034Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.162759354s" Oct 30 00:23:09.625739 containerd[1627]: time="2025-10-30T00:23:09.625731108Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Oct 30 00:23:09.627820 containerd[1627]: time="2025-10-30T00:23:09.627789425Z" level=info msg="CreateContainer within sandbox \"4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Oct 30 00:23:09.632293 containerd[1627]: time="2025-10-30T00:23:09.631046676Z" level=info msg="Container d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:09.662372 containerd[1627]: time="2025-10-30T00:23:09.662337138Z" level=info msg="CreateContainer within sandbox \"4a9c990228b2bf4e457d60e39739d4fa9751644f64979bb4ccf2da77c9dc1079\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47\"" Oct 30 00:23:09.663443 containerd[1627]: time="2025-10-30T00:23:09.663422796Z" level=info msg="StartContainer for \"d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47\"" Oct 30 00:23:09.664326 containerd[1627]: time="2025-10-30T00:23:09.664297846Z" level=info msg="connecting to shim d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47" address="unix:///run/containerd/s/39dccd52a16f343affe471f65d6f8513a099101700ac6e39ad8162af13dd9555" protocol=ttrpc version=3 Oct 30 00:23:09.685258 systemd[1]: Started cri-containerd-d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47.scope - libcontainer container d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47. 
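The pull record above reports both the transferred byte count (bytes read=25061691) and the elapsed time (2.162759354s) for quay.io/tigera/operator:v1.38.7, roughly 11.6 MB/s. A tiny sketch of that arithmetic, using the two numbers exactly as they appear in the log:

```go
// Sketch: effective pull throughput for the tigera-operator image, computed
// from the "bytes read" and elapsed-time values in the log entries above.
package main

import "fmt"

func main() {
	const bytesRead = 25061691  // "active requests=0, bytes read=25061691"
	const seconds = 2.162759354 // "... in 2.162759354s"
	fmt.Printf("%.1f MB/s\n", bytesRead/seconds/1e6) // ~11.6 MB/s
}
```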
Oct 30 00:23:09.705549 containerd[1627]: time="2025-10-30T00:23:09.705492202Z" level=info msg="StartContainer for \"d2d35e38fb3a8e83ea00a4bf426dd4fd78a2f4a6f7d9530e4450536748c9ec47\" returns successfully" Oct 30 00:23:10.223090 kubelet[2936]: I1030 00:23:10.222902 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-5mzrl" podStartSLOduration=1.059269912 podStartE2EDuration="3.222890024s" podCreationTimestamp="2025-10-30 00:23:07 +0000 UTC" firstStartedPulling="2025-10-30 00:23:07.462545217 +0000 UTC m=+6.434585012" lastFinishedPulling="2025-10-30 00:23:09.626165331 +0000 UTC m=+8.598205124" observedRunningTime="2025-10-30 00:23:10.222650864 +0000 UTC m=+9.194690669" watchObservedRunningTime="2025-10-30 00:23:10.222890024 +0000 UTC m=+9.194929821" Oct 30 00:23:15.111791 sudo[1947]: pam_unix(sudo:session): session closed for user root Oct 30 00:23:15.118869 sshd[1946]: Connection closed by 139.178.89.65 port 57770 Oct 30 00:23:15.123293 sshd-session[1943]: pam_unix(sshd:session): session closed for user core Oct 30 00:23:15.125676 systemd[1]: sshd@6-139.178.70.100:22-139.178.89.65:57770.service: Deactivated successfully. Oct 30 00:23:15.127733 systemd[1]: session-9.scope: Deactivated successfully. Oct 30 00:23:15.127912 systemd[1]: session-9.scope: Consumed 3.424s CPU time, 154.7M memory peak. Oct 30 00:23:15.129697 systemd-logind[1606]: Session 9 logged out. Waiting for processes to exit. Oct 30 00:23:15.130735 systemd-logind[1606]: Removed session 9. Oct 30 00:23:20.531686 systemd[1]: Created slice kubepods-besteffort-pod37feeea0_9f87_4048_83f7_9136c40a4413.slice - libcontainer container kubepods-besteffort-pod37feeea0_9f87_4048_83f7_9136c40a4413.slice. Oct 30 00:23:20.588228 kubelet[2936]: I1030 00:23:20.588193 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37feeea0-9f87-4048-83f7-9136c40a4413-tigera-ca-bundle\") pod \"calico-typha-855f6df89-jqfs2\" (UID: \"37feeea0-9f87-4048-83f7-9136c40a4413\") " pod="calico-system/calico-typha-855f6df89-jqfs2" Oct 30 00:23:20.588228 kubelet[2936]: I1030 00:23:20.588227 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdkgj\" (UniqueName: \"kubernetes.io/projected/37feeea0-9f87-4048-83f7-9136c40a4413-kube-api-access-fdkgj\") pod \"calico-typha-855f6df89-jqfs2\" (UID: \"37feeea0-9f87-4048-83f7-9136c40a4413\") " pod="calico-system/calico-typha-855f6df89-jqfs2" Oct 30 00:23:20.588526 kubelet[2936]: I1030 00:23:20.588242 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/37feeea0-9f87-4048-83f7-9136c40a4413-typha-certs\") pod \"calico-typha-855f6df89-jqfs2\" (UID: \"37feeea0-9f87-4048-83f7-9136c40a4413\") " pod="calico-system/calico-typha-855f6df89-jqfs2" Oct 30 00:23:20.758837 systemd[1]: Created slice kubepods-besteffort-pod49b65c28_6154_48e5_9626_2a88861485a7.slice - libcontainer container kubepods-besteffort-pod49b65c28_6154_48e5_9626_2a88861485a7.slice. 
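The startup-latency entries make the relationship between the two reported durations visible: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image pull window (lastFinishedPulling minus firstStartedPulling). For the tigera-operator pod above, 3.222890024s minus the ~2.1636s pull matches the logged 1.059269912s to within nanoseconds; for kube-proxy, whose pull timestamps are the zero time, the two durations are identical. A sketch of that check with the timestamps copied from the log (the tracker takes its own clock reading, so the reconstruction agrees only to sub-millisecond precision):

```go
// Sketch: reconstruct the tigera-operator pod's startup durations from the
// timestamps in the "Observed pod startup duration" entry above.
// SLO duration = end-to-end duration minus the image pull window.
package main

import (
	"fmt"
	"time"
)

// mustParse reads timestamps in the format the kubelet logs them in
// (Go's default time.Time string format).
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-30 00:23:07 +0000 UTC")             // podCreationTimestamp
	running := mustParse("2025-10-30 00:23:10.222650864 +0000 UTC")   // observedRunningTime
	pullStart := mustParse("2025-10-30 00:23:07.462545217 +0000 UTC") // firstStartedPulling
	pullEnd := mustParse("2025-10-30 00:23:09.626165331 +0000 UTC")   // lastFinishedPulling

	e2e := running.Sub(created)         // ~3.2229s (podStartE2EDuration)
	slo := e2e - pullEnd.Sub(pullStart) // ~1.0593s (podStartSLOduration)
	fmt.Println("E2E:", e2e, "SLO:", slo)
}
```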
Oct 30 00:23:20.789859 kubelet[2936]: I1030 00:23:20.789643 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-cni-bin-dir\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.789859 kubelet[2936]: I1030 00:23:20.789686 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-var-lib-calico\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.789859 kubelet[2936]: I1030 00:23:20.789735 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjps\" (UniqueName: \"kubernetes.io/projected/49b65c28-6154-48e5-9626-2a88861485a7-kube-api-access-4kjps\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.789859 kubelet[2936]: I1030 00:23:20.789749 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/49b65c28-6154-48e5-9626-2a88861485a7-node-certs\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.789859 kubelet[2936]: I1030 00:23:20.789763 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-policysync\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790027 kubelet[2936]: I1030 00:23:20.789776 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-var-run-calico\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790027 kubelet[2936]: I1030 00:23:20.789791 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-xtables-lock\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790027 kubelet[2936]: I1030 00:23:20.789809 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b65c28-6154-48e5-9626-2a88861485a7-tigera-ca-bundle\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790027 kubelet[2936]: I1030 00:23:20.789841 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-cni-log-dir\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790027 kubelet[2936]: I1030 00:23:20.789860 2936 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-cni-net-dir\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790853 kubelet[2936]: I1030 00:23:20.789877 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-flexvol-driver-host\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.790853 kubelet[2936]: I1030 00:23:20.789891 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49b65c28-6154-48e5-9626-2a88861485a7-lib-modules\") pod \"calico-node-9vl72\" (UID: \"49b65c28-6154-48e5-9626-2a88861485a7\") " pod="calico-system/calico-node-9vl72" Oct 30 00:23:20.854319 containerd[1627]: time="2025-10-30T00:23:20.854292705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-855f6df89-jqfs2,Uid:37feeea0-9f87-4048-83f7-9136c40a4413,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:20.989714 containerd[1627]: time="2025-10-30T00:23:20.989657790Z" level=info msg="connecting to shim e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5" address="unix:///run/containerd/s/a2e5e6c84910350298bd334882e5380a8347b2c5d5d83230e9989628ecff4b6d" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:21.010156 systemd[1]: Started cri-containerd-e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5.scope - libcontainer container e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5. 
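The calico-node pod above mounts flexvol-driver-host from the host, and the noisy block of errors that follows is the kubelet's FlexVolume prober looking for the driver that lives there: it repeatedly invokes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument, the binary does not exist yet ("executable file not found in $PATH"), the call therefore returns empty output, and decoding that empty output as JSON is what produces "unexpected end of JSON input". A minimal sketch of the last step, assuming only Go's encoding/json behaviour; the probes typically quiet down once something (for Calico, the node pod's flexvolume driver installer) drops the binary into that directory:

```go
// Sketch: decode the empty driver output that the FlexVolume probe errors
// below complain about. json.Unmarshal over zero bytes returns exactly the
// "unexpected end of JSON input" error seen in the log.
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	var status map[string]interface{}
	err := json.Unmarshal([]byte(""), &status) // empty output from the missing driver
	fmt.Println(err)                           // unexpected end of JSON input
}
```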
Oct 30 00:23:21.045104 kubelet[2936]: E1030 00:23:21.044992 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:21.066389 containerd[1627]: time="2025-10-30T00:23:21.066355216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9vl72,Uid:49b65c28-6154-48e5-9626-2a88861485a7,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:21.069597 containerd[1627]: time="2025-10-30T00:23:21.069569255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-855f6df89-jqfs2,Uid:37feeea0-9f87-4048-83f7-9136c40a4413,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5\"" Oct 30 00:23:21.073254 containerd[1627]: time="2025-10-30T00:23:21.073084371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Oct 30 00:23:21.077664 kubelet[2936]: E1030 00:23:21.077631 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.083042 kubelet[2936]: W1030 00:23:21.077775 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.085631 kubelet[2936]: E1030 00:23:21.085525 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.087301 kubelet[2936]: E1030 00:23:21.087277 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.087385 kubelet[2936]: W1030 00:23:21.087363 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.087432 kubelet[2936]: E1030 00:23:21.087424 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.087679 kubelet[2936]: E1030 00:23:21.087630 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.087872 kubelet[2936]: W1030 00:23:21.087732 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.087872 kubelet[2936]: E1030 00:23:21.087741 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.087971 kubelet[2936]: E1030 00:23:21.087964 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.088021 kubelet[2936]: W1030 00:23:21.088014 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.088083 kubelet[2936]: E1030 00:23:21.088073 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.088327 kubelet[2936]: E1030 00:23:21.088257 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.088327 kubelet[2936]: W1030 00:23:21.088267 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.088327 kubelet[2936]: E1030 00:23:21.088276 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.088427 kubelet[2936]: E1030 00:23:21.088420 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.088513 kubelet[2936]: W1030 00:23:21.088465 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.088513 kubelet[2936]: E1030 00:23:21.088474 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.088648 kubelet[2936]: E1030 00:23:21.088600 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.088648 kubelet[2936]: W1030 00:23:21.088606 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.088648 kubelet[2936]: E1030 00:23:21.088612 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.088785 kubelet[2936]: E1030 00:23:21.088778 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.088876 kubelet[2936]: W1030 00:23:21.088823 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.088876 kubelet[2936]: E1030 00:23:21.088832 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.089045 kubelet[2936]: E1030 00:23:21.088999 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.089045 kubelet[2936]: W1030 00:23:21.089007 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.089045 kubelet[2936]: E1030 00:23:21.089014 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.089256 kubelet[2936]: E1030 00:23:21.089206 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.089256 kubelet[2936]: W1030 00:23:21.089212 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.089256 kubelet[2936]: E1030 00:23:21.089218 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.089400 kubelet[2936]: E1030 00:23:21.089393 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.089493 kubelet[2936]: W1030 00:23:21.089443 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.089493 kubelet[2936]: E1030 00:23:21.089451 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.089640 kubelet[2936]: E1030 00:23:21.089581 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.089640 kubelet[2936]: W1030 00:23:21.089587 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.089640 kubelet[2936]: E1030 00:23:21.089592 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.089782 kubelet[2936]: E1030 00:23:21.089774 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.089824 kubelet[2936]: W1030 00:23:21.089818 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.089862 kubelet[2936]: E1030 00:23:21.089854 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.090051 kubelet[2936]: E1030 00:23:21.090005 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.090051 kubelet[2936]: W1030 00:23:21.090014 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.090051 kubelet[2936]: E1030 00:23:21.090022 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.090310 kubelet[2936]: E1030 00:23:21.090236 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.090310 kubelet[2936]: W1030 00:23:21.090245 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.090310 kubelet[2936]: E1030 00:23:21.090253 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.090450 kubelet[2936]: E1030 00:23:21.090442 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.090501 kubelet[2936]: W1030 00:23:21.090492 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.090542 kubelet[2936]: E1030 00:23:21.090537 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.091063 kubelet[2936]: E1030 00:23:21.090704 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.091117 kubelet[2936]: W1030 00:23:21.091107 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.091166 kubelet[2936]: E1030 00:23:21.091158 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.091714 kubelet[2936]: E1030 00:23:21.091330 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.091714 kubelet[2936]: W1030 00:23:21.091338 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.091714 kubelet[2936]: E1030 00:23:21.091345 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.091991 kubelet[2936]: E1030 00:23:21.091860 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.091991 kubelet[2936]: W1030 00:23:21.091870 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.091991 kubelet[2936]: E1030 00:23:21.091879 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.092157 kubelet[2936]: E1030 00:23:21.092149 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.092316 kubelet[2936]: W1030 00:23:21.092194 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.092316 kubelet[2936]: E1030 00:23:21.092225 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.092596 kubelet[2936]: E1030 00:23:21.092573 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.092674 kubelet[2936]: W1030 00:23:21.092582 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.092674 kubelet[2936]: E1030 00:23:21.092644 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.092741 kubelet[2936]: I1030 00:23:21.092722 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54feb653-7cf9-4ac1-9a8d-a45c98b9a230-registration-dir\") pod \"csi-node-driver-7hzkb\" (UID: \"54feb653-7cf9-4ac1-9a8d-a45c98b9a230\") " pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:21.092849 kubelet[2936]: E1030 00:23:21.092834 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.092849 kubelet[2936]: W1030 00:23:21.092846 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.092896 kubelet[2936]: E1030 00:23:21.092860 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.092959 kubelet[2936]: E1030 00:23:21.092950 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.092985 kubelet[2936]: W1030 00:23:21.092957 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093084 kubelet[2936]: E1030 00:23:21.092973 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.093118 kubelet[2936]: E1030 00:23:21.093097 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093118 kubelet[2936]: W1030 00:23:21.093104 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093118 kubelet[2936]: E1030 00:23:21.093110 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.093206 kubelet[2936]: I1030 00:23:21.093132 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rnb\" (UniqueName: \"kubernetes.io/projected/54feb653-7cf9-4ac1-9a8d-a45c98b9a230-kube-api-access-r8rnb\") pod \"csi-node-driver-7hzkb\" (UID: \"54feb653-7cf9-4ac1-9a8d-a45c98b9a230\") " pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:21.093231 kubelet[2936]: E1030 00:23:21.093226 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093253 kubelet[2936]: W1030 00:23:21.093232 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093253 kubelet[2936]: E1030 00:23:21.093237 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.093253 kubelet[2936]: I1030 00:23:21.093248 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54feb653-7cf9-4ac1-9a8d-a45c98b9a230-kubelet-dir\") pod \"csi-node-driver-7hzkb\" (UID: \"54feb653-7cf9-4ac1-9a8d-a45c98b9a230\") " pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:21.093548 kubelet[2936]: E1030 00:23:21.093537 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093548 kubelet[2936]: W1030 00:23:21.093546 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093584 kubelet[2936]: E1030 00:23:21.093553 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.093584 kubelet[2936]: I1030 00:23:21.093566 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54feb653-7cf9-4ac1-9a8d-a45c98b9a230-socket-dir\") pod \"csi-node-driver-7hzkb\" (UID: \"54feb653-7cf9-4ac1-9a8d-a45c98b9a230\") " pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:21.093673 kubelet[2936]: E1030 00:23:21.093662 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093673 kubelet[2936]: W1030 00:23:21.093671 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093710 kubelet[2936]: E1030 00:23:21.093692 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.093710 kubelet[2936]: I1030 00:23:21.093707 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/54feb653-7cf9-4ac1-9a8d-a45c98b9a230-varrun\") pod \"csi-node-driver-7hzkb\" (UID: \"54feb653-7cf9-4ac1-9a8d-a45c98b9a230\") " pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:21.093837 kubelet[2936]: E1030 00:23:21.093826 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093837 kubelet[2936]: W1030 00:23:21.093834 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093884 kubelet[2936]: E1030 00:23:21.093840 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.093925 kubelet[2936]: E1030 00:23:21.093914 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.093925 kubelet[2936]: W1030 00:23:21.093921 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.093970 kubelet[2936]: E1030 00:23:21.093926 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.094039 kubelet[2936]: E1030 00:23:21.094018 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094074 kubelet[2936]: W1030 00:23:21.094026 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094074 kubelet[2936]: E1030 00:23:21.094053 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.094140 kubelet[2936]: E1030 00:23:21.094132 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094140 kubelet[2936]: W1030 00:23:21.094138 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094185 kubelet[2936]: E1030 00:23:21.094143 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.094246 kubelet[2936]: E1030 00:23:21.094238 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094246 kubelet[2936]: W1030 00:23:21.094245 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094343 kubelet[2936]: E1030 00:23:21.094249 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.094343 kubelet[2936]: E1030 00:23:21.094328 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094343 kubelet[2936]: W1030 00:23:21.094333 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094343 kubelet[2936]: E1030 00:23:21.094337 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.094423 kubelet[2936]: E1030 00:23:21.094412 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094423 kubelet[2936]: W1030 00:23:21.094419 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094423 kubelet[2936]: E1030 00:23:21.094424 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.094526 kubelet[2936]: E1030 00:23:21.094493 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.094526 kubelet[2936]: W1030 00:23:21.094498 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.094526 kubelet[2936]: E1030 00:23:21.094502 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.194506 kubelet[2936]: E1030 00:23:21.194485 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.194506 kubelet[2936]: W1030 00:23:21.194499 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.194506 kubelet[2936]: E1030 00:23:21.194512 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.194752 kubelet[2936]: E1030 00:23:21.194610 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.194752 kubelet[2936]: W1030 00:23:21.194615 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.194752 kubelet[2936]: E1030 00:23:21.194620 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.194894 kubelet[2936]: E1030 00:23:21.194846 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.194894 kubelet[2936]: W1030 00:23:21.194855 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.194894 kubelet[2936]: E1030 00:23:21.194862 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.195097 kubelet[2936]: E1030 00:23:21.195062 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195097 kubelet[2936]: W1030 00:23:21.195069 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195097 kubelet[2936]: E1030 00:23:21.195074 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.195330 kubelet[2936]: E1030 00:23:21.195268 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195330 kubelet[2936]: W1030 00:23:21.195274 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195330 kubelet[2936]: E1030 00:23:21.195280 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.195551 kubelet[2936]: E1030 00:23:21.195480 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195551 kubelet[2936]: W1030 00:23:21.195486 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195551 kubelet[2936]: E1030 00:23:21.195491 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.195665 kubelet[2936]: E1030 00:23:21.195619 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195665 kubelet[2936]: W1030 00:23:21.195626 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195665 kubelet[2936]: E1030 00:23:21.195632 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.195731 kubelet[2936]: E1030 00:23:21.195715 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195731 kubelet[2936]: W1030 00:23:21.195720 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195731 kubelet[2936]: E1030 00:23:21.195724 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.195925 kubelet[2936]: E1030 00:23:21.195810 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.195925 kubelet[2936]: W1030 00:23:21.195817 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.195925 kubelet[2936]: E1030 00:23:21.195823 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.196094 kubelet[2936]: E1030 00:23:21.195999 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196094 kubelet[2936]: W1030 00:23:21.196004 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.196094 kubelet[2936]: E1030 00:23:21.196010 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.196253 kubelet[2936]: E1030 00:23:21.196242 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196322 kubelet[2936]: W1030 00:23:21.196283 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.196322 kubelet[2936]: E1030 00:23:21.196291 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.196536 kubelet[2936]: E1030 00:23:21.196467 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196536 kubelet[2936]: W1030 00:23:21.196473 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.196536 kubelet[2936]: E1030 00:23:21.196478 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.196661 kubelet[2936]: E1030 00:23:21.196647 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196661 kubelet[2936]: W1030 00:23:21.196657 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.196711 kubelet[2936]: E1030 00:23:21.196664 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.196831 kubelet[2936]: E1030 00:23:21.196820 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196831 kubelet[2936]: W1030 00:23:21.196827 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.196896 kubelet[2936]: E1030 00:23:21.196832 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.196999 kubelet[2936]: E1030 00:23:21.196987 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.196999 kubelet[2936]: W1030 00:23:21.196996 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197076 kubelet[2936]: E1030 00:23:21.197002 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.197207 kubelet[2936]: E1030 00:23:21.197199 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197207 kubelet[2936]: W1030 00:23:21.197206 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197265 kubelet[2936]: E1030 00:23:21.197212 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.197400 kubelet[2936]: E1030 00:23:21.197388 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197400 kubelet[2936]: W1030 00:23:21.197396 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197443 kubelet[2936]: E1030 00:23:21.197401 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.197507 kubelet[2936]: E1030 00:23:21.197499 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197507 kubelet[2936]: W1030 00:23:21.197506 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197565 kubelet[2936]: E1030 00:23:21.197511 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.197613 kubelet[2936]: E1030 00:23:21.197600 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197613 kubelet[2936]: W1030 00:23:21.197607 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197613 kubelet[2936]: E1030 00:23:21.197611 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.197700 kubelet[2936]: E1030 00:23:21.197689 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197700 kubelet[2936]: W1030 00:23:21.197697 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197753 kubelet[2936]: E1030 00:23:21.197702 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.197815 kubelet[2936]: E1030 00:23:21.197804 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197815 kubelet[2936]: W1030 00:23:21.197812 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197857 kubelet[2936]: E1030 00:23:21.197817 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.197913 kubelet[2936]: E1030 00:23:21.197905 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.197913 kubelet[2936]: W1030 00:23:21.197911 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.197965 kubelet[2936]: E1030 00:23:21.197916 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.198006 kubelet[2936]: E1030 00:23:21.197995 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.198076 kubelet[2936]: W1030 00:23:21.198014 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.198076 kubelet[2936]: E1030 00:23:21.198020 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.198142 kubelet[2936]: E1030 00:23:21.198131 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.198142 kubelet[2936]: W1030 00:23:21.198137 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.198183 kubelet[2936]: E1030 00:23:21.198142 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.206273 kubelet[2936]: E1030 00:23:21.205182 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.206273 kubelet[2936]: W1030 00:23:21.205195 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.206273 kubelet[2936]: E1030 00:23:21.205235 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:21.206423 kubelet[2936]: E1030 00:23:21.206393 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:21.206423 kubelet[2936]: W1030 00:23:21.206403 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:21.206423 kubelet[2936]: E1030 00:23:21.206411 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:21.230554 containerd[1627]: time="2025-10-30T00:23:21.230486840Z" level=info msg="connecting to shim d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a" address="unix:///run/containerd/s/20a415735b9291a077959f7501f6b9f9785c8518a7c0a04c5b5ba1a0816571a2" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:21.254173 systemd[1]: Started cri-containerd-d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a.scope - libcontainer container d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a. Oct 30 00:23:21.297971 containerd[1627]: time="2025-10-30T00:23:21.297873282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9vl72,Uid:49b65c28-6154-48e5-9626-2a88861485a7,Namespace:calico-system,Attempt:0,} returns sandbox id \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\"" Oct 30 00:23:22.166950 kubelet[2936]: E1030 00:23:22.166879 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:23.013386 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2797721841.mount: Deactivated successfully. 
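The repeated kubelet messages above all come from the FlexVolume plugin prober: the directory name nodeagent~uds encodes vendor "nodeagent" and driver "uds", so the kubelet execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and tries to unmarshal its stdout as JSON. Because that executable is not installed on this node, the exec fails, stdout stays empty, and the JSON decode reports "unexpected end of JSON input". As a hedged illustration only (this is not the real node-agent UDS driver shipped by any vendor), a minimal stub that would satisfy the init call could look like the following Go program, which just prints the status/capabilities JSON the kubelet expects:

// flexvol_stub.go - illustrative sketch of the FlexVolume call contract the kubelet
// is probing above; the path and driver name come from the log, everything else is
// an assumption for demonstration purposes.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON object a FlexVolume driver must print to stdout.
type driverStatus struct {
	Status       string          `json:"status"`                 // "Success", "Failure" or "Not supported"
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"` // only meaningful for "init"
}

func main() {
	if len(os.Args) < 2 {
		reply(driverStatus{Status: "Failure", Message: "no command given"})
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Advertise that this driver does not implement attach/detach.
		reply(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
	default:
		// Every other call must still answer with valid JSON.
		reply(driverStatus{Status: "Not supported", Message: os.Args[1]})
	}
}

func reply(s driverStatus) {
	out, _ := json.Marshal(s)
	fmt.Println(string(out))
}

Installing a binary like this at the probed path, or removing the stale nodeagent~uds plugin directory, would presumably silence the probe errors; the containerd and Calico activity that follows in the log is independent of them.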
Oct 30 00:23:24.075236 containerd[1627]: time="2025-10-30T00:23:24.075128009Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:24.081294 containerd[1627]: time="2025-10-30T00:23:24.081269863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Oct 30 00:23:24.088160 containerd[1627]: time="2025-10-30T00:23:24.088119162Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:24.095409 containerd[1627]: time="2025-10-30T00:23:24.095370316Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:24.096016 containerd[1627]: time="2025-10-30T00:23:24.095741102Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.022626702s" Oct 30 00:23:24.096016 containerd[1627]: time="2025-10-30T00:23:24.095769769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Oct 30 00:23:24.096881 containerd[1627]: time="2025-10-30T00:23:24.096837433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Oct 30 00:23:24.112931 containerd[1627]: time="2025-10-30T00:23:24.112907320Z" level=info msg="CreateContainer within sandbox \"e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Oct 30 00:23:24.118449 containerd[1627]: time="2025-10-30T00:23:24.118349861Z" level=info msg="Container 749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:24.120554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount451504129.mount: Deactivated successfully. 
Oct 30 00:23:24.125191 containerd[1627]: time="2025-10-30T00:23:24.125118970Z" level=info msg="CreateContainer within sandbox \"e1d5e77a7a72515c8caf02b614e8f0c0a7669ce0359c436d95d51b7fedbdb3d5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00\"" Oct 30 00:23:24.125526 containerd[1627]: time="2025-10-30T00:23:24.125509700Z" level=info msg="StartContainer for \"749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00\"" Oct 30 00:23:24.128918 containerd[1627]: time="2025-10-30T00:23:24.128848052Z" level=info msg="connecting to shim 749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00" address="unix:///run/containerd/s/a2e5e6c84910350298bd334882e5380a8347b2c5d5d83230e9989628ecff4b6d" protocol=ttrpc version=3 Oct 30 00:23:24.166195 kubelet[2936]: E1030 00:23:24.166164 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:24.170171 systemd[1]: Started cri-containerd-749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00.scope - libcontainer container 749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00. Oct 30 00:23:24.212756 containerd[1627]: time="2025-10-30T00:23:24.212674300Z" level=info msg="StartContainer for \"749d9fc2183922224b910e891a30bc9d913207756e784f3330b4c39fa03e5a00\" returns successfully" Oct 30 00:23:24.311746 kubelet[2936]: E1030 00:23:24.311717 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.311746 kubelet[2936]: W1030 00:23:24.311733 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.311746 kubelet[2936]: E1030 00:23:24.311746 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.311936 kubelet[2936]: E1030 00:23:24.311920 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.311936 kubelet[2936]: W1030 00:23:24.311928 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.311936 kubelet[2936]: E1030 00:23:24.311934 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312007 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.312767 kubelet[2936]: W1030 00:23:24.312011 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312027 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312234 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.312767 kubelet[2936]: W1030 00:23:24.312241 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312246 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312398 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.312767 kubelet[2936]: W1030 00:23:24.312403 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312409 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.312767 kubelet[2936]: E1030 00:23:24.312482 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.323788 kubelet[2936]: W1030 00:23:24.312486 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312491 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312628 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.323788 kubelet[2936]: W1030 00:23:24.312633 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312638 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312713 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.323788 kubelet[2936]: W1030 00:23:24.312718 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312722 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.323788 kubelet[2936]: E1030 00:23:24.312871 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.323788 kubelet[2936]: W1030 00:23:24.312876 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.312883 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.312952 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330334 kubelet[2936]: W1030 00:23:24.312957 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.312962 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.313106 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330334 kubelet[2936]: W1030 00:23:24.313112 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.313117 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.313192 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330334 kubelet[2936]: W1030 00:23:24.313197 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330334 kubelet[2936]: E1030 00:23:24.313202 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313358 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330630 kubelet[2936]: W1030 00:23:24.313363 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313368 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313443 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330630 kubelet[2936]: W1030 00:23:24.313448 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313452 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313529 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330630 kubelet[2936]: W1030 00:23:24.313533 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.313537 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330630 kubelet[2936]: E1030 00:23:24.316519 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330795 kubelet[2936]: W1030 00:23:24.316530 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.316540 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.316619 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330795 kubelet[2936]: W1030 00:23:24.316625 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.316630 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.316785 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330795 kubelet[2936]: W1030 00:23:24.316791 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.316797 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330795 kubelet[2936]: E1030 00:23:24.317090 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330795 kubelet[2936]: W1030 00:23:24.317096 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317102 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317289 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330958 kubelet[2936]: W1030 00:23:24.317293 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317298 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317400 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330958 kubelet[2936]: W1030 00:23:24.317405 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317411 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317722 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.330958 kubelet[2936]: W1030 00:23:24.317728 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.330958 kubelet[2936]: E1030 00:23:24.317733 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318057 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331124 kubelet[2936]: W1030 00:23:24.318063 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318069 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318708 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331124 kubelet[2936]: W1030 00:23:24.318714 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318720 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318844 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331124 kubelet[2936]: W1030 00:23:24.318849 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318857 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331124 kubelet[2936]: E1030 00:23:24.318937 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331288 kubelet[2936]: W1030 00:23:24.318942 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.318948 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.319105 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331288 kubelet[2936]: W1030 00:23:24.319110 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.319116 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.319481 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331288 kubelet[2936]: W1030 00:23:24.319487 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.319493 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331288 kubelet[2936]: E1030 00:23:24.319647 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331288 kubelet[2936]: W1030 00:23:24.319652 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.319657 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.319734 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331442 kubelet[2936]: W1030 00:23:24.319739 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.319744 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.319898 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331442 kubelet[2936]: W1030 00:23:24.319902 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.319907 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.320066 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331442 kubelet[2936]: W1030 00:23:24.320072 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331442 kubelet[2936]: E1030 00:23:24.320077 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:24.331599 kubelet[2936]: E1030 00:23:24.321516 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:24.331599 kubelet[2936]: W1030 00:23:24.321524 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:24.331599 kubelet[2936]: E1030 00:23:24.321531 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.293056 kubelet[2936]: I1030 00:23:25.292886 2936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 30 00:23:25.319968 kubelet[2936]: E1030 00:23:25.319934 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.319968 kubelet[2936]: W1030 00:23:25.319953 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.319968 kubelet[2936]: E1030 00:23:25.319966 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320173 kubelet[2936]: E1030 00:23:25.320113 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320173 kubelet[2936]: W1030 00:23:25.320118 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320173 kubelet[2936]: E1030 00:23:25.320124 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320263 kubelet[2936]: E1030 00:23:25.320232 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320263 kubelet[2936]: W1030 00:23:25.320239 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320263 kubelet[2936]: E1030 00:23:25.320247 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320344 kubelet[2936]: E1030 00:23:25.320331 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320344 kubelet[2936]: W1030 00:23:25.320336 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320344 kubelet[2936]: E1030 00:23:25.320341 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.320440 kubelet[2936]: E1030 00:23:25.320425 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320440 kubelet[2936]: W1030 00:23:25.320437 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320498 kubelet[2936]: E1030 00:23:25.320444 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320541 kubelet[2936]: E1030 00:23:25.320528 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320541 kubelet[2936]: W1030 00:23:25.320532 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320541 kubelet[2936]: E1030 00:23:25.320537 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320636 kubelet[2936]: E1030 00:23:25.320615 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320636 kubelet[2936]: W1030 00:23:25.320620 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320636 kubelet[2936]: E1030 00:23:25.320627 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320732 kubelet[2936]: E1030 00:23:25.320706 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320732 kubelet[2936]: W1030 00:23:25.320711 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320732 kubelet[2936]: E1030 00:23:25.320715 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.320831 kubelet[2936]: E1030 00:23:25.320803 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320831 kubelet[2936]: W1030 00:23:25.320809 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320831 kubelet[2936]: E1030 00:23:25.320816 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.320935 kubelet[2936]: E1030 00:23:25.320891 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.320935 kubelet[2936]: W1030 00:23:25.320896 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.320935 kubelet[2936]: E1030 00:23:25.320904 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.321114 kubelet[2936]: E1030 00:23:25.320977 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.321114 kubelet[2936]: W1030 00:23:25.320981 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.321114 kubelet[2936]: E1030 00:23:25.320986 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.321114 kubelet[2936]: E1030 00:23:25.321098 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.321114 kubelet[2936]: W1030 00:23:25.321103 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.321114 kubelet[2936]: E1030 00:23:25.321109 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.321373 kubelet[2936]: E1030 00:23:25.321203 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.321373 kubelet[2936]: W1030 00:23:25.321208 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.321373 kubelet[2936]: E1030 00:23:25.321215 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.321373 kubelet[2936]: E1030 00:23:25.321293 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.321373 kubelet[2936]: W1030 00:23:25.321298 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.321373 kubelet[2936]: E1030 00:23:25.321303 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.321573 kubelet[2936]: E1030 00:23:25.321410 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.321573 kubelet[2936]: W1030 00:23:25.321415 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.321573 kubelet[2936]: E1030 00:23:25.321420 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.323842 kubelet[2936]: E1030 00:23:25.323820 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.323842 kubelet[2936]: W1030 00:23:25.323830 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.323842 kubelet[2936]: E1030 00:23:25.323840 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.324139 kubelet[2936]: E1030 00:23:25.323986 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.324139 kubelet[2936]: W1030 00:23:25.323991 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.324139 kubelet[2936]: E1030 00:23:25.323997 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.324789 kubelet[2936]: E1030 00:23:25.324741 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.324950 kubelet[2936]: W1030 00:23:25.324860 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.324950 kubelet[2936]: E1030 00:23:25.324876 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.325206 kubelet[2936]: E1030 00:23:25.325133 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.325206 kubelet[2936]: W1030 00:23:25.325143 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.325206 kubelet[2936]: E1030 00:23:25.325152 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.325610 kubelet[2936]: E1030 00:23:25.325549 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.325610 kubelet[2936]: W1030 00:23:25.325556 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.325610 kubelet[2936]: E1030 00:23:25.325563 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.325848 kubelet[2936]: E1030 00:23:25.325841 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.326069 kubelet[2936]: W1030 00:23:25.326057 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.326199 kubelet[2936]: E1030 00:23:25.326128 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.326374 kubelet[2936]: E1030 00:23:25.326365 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.326551 kubelet[2936]: W1030 00:23:25.326448 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.326551 kubelet[2936]: E1030 00:23:25.326461 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.326743 kubelet[2936]: E1030 00:23:25.326729 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.326879 kubelet[2936]: W1030 00:23:25.326799 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.326879 kubelet[2936]: E1030 00:23:25.326812 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.327110 kubelet[2936]: E1030 00:23:25.327092 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327110 kubelet[2936]: W1030 00:23:25.327102 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.327181 kubelet[2936]: E1030 00:23:25.327113 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.327245 kubelet[2936]: E1030 00:23:25.327196 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327245 kubelet[2936]: W1030 00:23:25.327201 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.327245 kubelet[2936]: E1030 00:23:25.327206 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.327329 kubelet[2936]: E1030 00:23:25.327273 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327329 kubelet[2936]: W1030 00:23:25.327280 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.327329 kubelet[2936]: E1030 00:23:25.327293 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.327414 kubelet[2936]: E1030 00:23:25.327379 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327414 kubelet[2936]: W1030 00:23:25.327385 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.327414 kubelet[2936]: E1030 00:23:25.327394 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.327802 kubelet[2936]: E1030 00:23:25.327601 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327802 kubelet[2936]: W1030 00:23:25.327610 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.327802 kubelet[2936]: E1030 00:23:25.327624 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.327802 kubelet[2936]: E1030 00:23:25.327763 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.327802 kubelet[2936]: W1030 00:23:25.327783 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.328102 kubelet[2936]: E1030 00:23:25.327807 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.328102 kubelet[2936]: E1030 00:23:25.327961 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.328102 kubelet[2936]: W1030 00:23:25.327967 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.328102 kubelet[2936]: E1030 00:23:25.327972 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.328297 kubelet[2936]: E1030 00:23:25.328144 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.328297 kubelet[2936]: W1030 00:23:25.328152 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.328297 kubelet[2936]: E1030 00:23:25.328171 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.328460 kubelet[2936]: E1030 00:23:25.328370 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.328460 kubelet[2936]: W1030 00:23:25.328377 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.328460 kubelet[2936]: E1030 00:23:25.328383 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 30 00:23:25.328692 kubelet[2936]: E1030 00:23:25.328586 2936 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 30 00:23:25.328692 kubelet[2936]: W1030 00:23:25.328594 2936 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 30 00:23:25.328692 kubelet[2936]: E1030 00:23:25.328603 2936 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 30 00:23:25.673961 containerd[1627]: time="2025-10-30T00:23:25.673891968Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:25.679123 containerd[1627]: time="2025-10-30T00:23:25.679044205Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Oct 30 00:23:25.689051 containerd[1627]: time="2025-10-30T00:23:25.688624751Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:25.695645 containerd[1627]: time="2025-10-30T00:23:25.695622849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:25.696096 containerd[1627]: time="2025-10-30T00:23:25.695881750Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.599025993s" Oct 30 00:23:25.696096 containerd[1627]: time="2025-10-30T00:23:25.695900170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Oct 30 00:23:25.712533 containerd[1627]: time="2025-10-30T00:23:25.712512860Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 30 00:23:25.763586 containerd[1627]: time="2025-10-30T00:23:25.763561213Z" level=info msg="Container 573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:25.765976 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1719342123.mount: Deactivated successfully. Oct 30 00:23:25.780993 containerd[1627]: time="2025-10-30T00:23:25.780930614Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\"" Oct 30 00:23:25.781878 containerd[1627]: time="2025-10-30T00:23:25.781389404Z" level=info msg="StartContainer for \"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\"" Oct 30 00:23:25.784148 containerd[1627]: time="2025-10-30T00:23:25.784106513Z" level=info msg="connecting to shim 573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160" address="unix:///run/containerd/s/20a415735b9291a077959f7501f6b9f9785c8518a7c0a04c5b5ba1a0816571a2" protocol=ttrpc version=3 Oct 30 00:23:25.810146 systemd[1]: Started cri-containerd-573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160.scope - libcontainer container 573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160. 
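The burst of driver-call.go / plugins.go errors above is the kubelet probing the FlexVolume plugin directory before Calico's flexvol-driver init container (created and started just above from the pod2daemon-flexvol image) has installed the `uds` binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds. Each FlexVolume invocation is expected to print a JSON status object on stdout; with the executable missing the output is empty, and unmarshalling "" produces the repeated "unexpected end of JSON input". A minimal sketch of the response shape an `init` call is expected to emit — the `driverStatus` struct below is illustrative of the JSON contract, not the kubelet's internal type:

```go
// Illustrative stand-in for a FlexVolume driver entry point, showing the
// JSON the kubelet's driver-call expects on stdout. The real driver here is
// Calico's pod2daemon-flexvol "uds" binary; this sketch only explains why an
// empty stdout triggers the unmarshal errors seen above.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out)) // kubelet unmarshals this stdout
		return
	}
	// Calls the driver does not implement should still answer with JSON.
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
}
```

Once the flexvol-driver container copies the real binary into place, the probe errors stop, which is presumably why they do not recur later in this log.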
Oct 30 00:23:25.847467 containerd[1627]: time="2025-10-30T00:23:25.847447108Z" level=info msg="StartContainer for \"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\" returns successfully" Oct 30 00:23:25.853901 systemd[1]: cri-containerd-573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160.scope: Deactivated successfully. Oct 30 00:23:25.865119 containerd[1627]: time="2025-10-30T00:23:25.864976965Z" level=info msg="received exit event container_id:\"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\" id:\"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\" pid:3634 exited_at:{seconds:1761783805 nanos:856470653}" Oct 30 00:23:25.873949 containerd[1627]: time="2025-10-30T00:23:25.873071402Z" level=info msg="TaskExit event in podsandbox handler container_id:\"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\" id:\"573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160\" pid:3634 exited_at:{seconds:1761783805 nanos:856470653}" Oct 30 00:23:25.888241 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-573eb1eabf05833c53275ade728c282f83d7568903186358da57e204e2ccf160-rootfs.mount: Deactivated successfully. Oct 30 00:23:26.166081 kubelet[2936]: E1030 00:23:26.166016 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:26.197598 kubelet[2936]: I1030 00:23:26.197535 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-855f6df89-jqfs2" podStartSLOduration=3.172913229 podStartE2EDuration="6.197524523s" podCreationTimestamp="2025-10-30 00:23:20 +0000 UTC" firstStartedPulling="2025-10-30 00:23:21.071805506 +0000 UTC m=+20.043845300" lastFinishedPulling="2025-10-30 00:23:24.096416798 +0000 UTC m=+23.068456594" observedRunningTime="2025-10-30 00:23:24.321730316 +0000 UTC m=+23.293770120" watchObservedRunningTime="2025-10-30 00:23:26.197524523 +0000 UTC m=+25.169564321" Oct 30 00:23:26.297346 containerd[1627]: time="2025-10-30T00:23:26.297284202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Oct 30 00:23:28.166262 kubelet[2936]: E1030 00:23:28.166039 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:29.706931 containerd[1627]: time="2025-10-30T00:23:29.706890487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:29.711283 containerd[1627]: time="2025-10-30T00:23:29.707277069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Oct 30 00:23:29.711283 containerd[1627]: time="2025-10-30T00:23:29.707634759Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:29.711482 containerd[1627]: time="2025-10-30T00:23:29.708584215Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with 
image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 3.411261784s" Oct 30 00:23:29.711482 containerd[1627]: time="2025-10-30T00:23:29.711433684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Oct 30 00:23:29.711829 containerd[1627]: time="2025-10-30T00:23:29.711811834Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:29.713709 containerd[1627]: time="2025-10-30T00:23:29.713686035Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 30 00:23:29.719584 containerd[1627]: time="2025-10-30T00:23:29.718821914Z" level=info msg="Container 9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:29.728569 containerd[1627]: time="2025-10-30T00:23:29.728553002Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\"" Oct 30 00:23:29.729015 containerd[1627]: time="2025-10-30T00:23:29.728897254Z" level=info msg="StartContainer for \"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\"" Oct 30 00:23:29.729700 containerd[1627]: time="2025-10-30T00:23:29.729686395Z" level=info msg="connecting to shim 9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10" address="unix:///run/containerd/s/20a415735b9291a077959f7501f6b9f9785c8518a7c0a04c5b5ba1a0816571a2" protocol=ttrpc version=3 Oct 30 00:23:29.746117 systemd[1]: Started cri-containerd-9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10.scope - libcontainer container 9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10. Oct 30 00:23:29.776149 containerd[1627]: time="2025-10-30T00:23:29.776125695Z" level=info msg="StartContainer for \"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\" returns successfully" Oct 30 00:23:30.166240 kubelet[2936]: E1030 00:23:30.165995 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:31.102090 systemd[1]: cri-containerd-9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10.scope: Deactivated successfully. Oct 30 00:23:31.102274 systemd[1]: cri-containerd-9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10.scope: Consumed 283ms CPU time, 164.6M memory peak, 6.1M read from disk, 171.3M written to disk. 
Oct 30 00:23:31.129701 containerd[1627]: time="2025-10-30T00:23:31.128924854Z" level=info msg="received exit event container_id:\"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\" id:\"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\" pid:3696 exited_at:{seconds:1761783811 nanos:128777236}" Oct 30 00:23:31.141579 containerd[1627]: time="2025-10-30T00:23:31.141550043Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\" id:\"9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10\" pid:3696 exited_at:{seconds:1761783811 nanos:128777236}" Oct 30 00:23:31.157494 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9b3f4c54d7ab103ac7e6df2c7b6f2912e6279932714a84adb65296f89b614a10-rootfs.mount: Deactivated successfully. Oct 30 00:23:31.178763 kubelet[2936]: I1030 00:23:31.178746 2936 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Oct 30 00:23:31.449504 systemd[1]: Created slice kubepods-burstable-pod293dec99_0158_41c1_ba5d_1c31149fed67.slice - libcontainer container kubepods-burstable-pod293dec99_0158_41c1_ba5d_1c31149fed67.slice. Oct 30 00:23:31.453379 kubelet[2936]: E1030 00:23:31.453209 2936 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"whisker-ca-bundle\"" type="*v1.ConfigMap" Oct 30 00:23:31.453379 kubelet[2936]: E1030 00:23:31.453258 2936 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" reflector="object-\"calico-apiserver\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Oct 30 00:23:31.453665 kubelet[2936]: E1030 00:23:31.453577 2936 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" reflector="object-\"calico-apiserver\"/\"calico-apiserver-certs\"" type="*v1.Secret" Oct 30 00:23:31.457541 systemd[1]: Created slice kubepods-besteffort-pod3e1c4ef0_e4a3_4ed0_8db5_48e57bb8c5da.slice - libcontainer container kubepods-besteffort-pod3e1c4ef0_e4a3_4ed0_8db5_48e57bb8c5da.slice. Oct 30 00:23:31.466708 systemd[1]: Created slice kubepods-besteffort-pod61d50a52_6f08_4848_bb64_ac0260f39fa4.slice - libcontainer container kubepods-besteffort-pod61d50a52_6f08_4848_bb64_ac0260f39fa4.slice. Oct 30 00:23:31.473243 systemd[1]: Created slice kubepods-besteffort-pode46b88e1_56de_4271_a039_c9f0466406b0.slice - libcontainer container kubepods-besteffort-pode46b88e1_56de_4271_a039_c9f0466406b0.slice. Oct 30 00:23:31.481884 systemd[1]: Created slice kubepods-besteffort-pod31c13ead_4475_476c_a22d_7ad5dc225044.slice - libcontainer container kubepods-besteffort-pod31c13ead_4475_476c_a22d_7ad5dc225044.slice. 
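The kubepods-*.slice units created above follow the naming scheme visible in the log itself: kubepods-<qos-class>-pod<uid>.slice, with the pod UID's hyphens replaced by underscores, as the systemd cgroup driver does. A short illustration that rebuilds two of those slice names from UIDs appearing in the volume-attach entries that follow; this demonstrates the pattern and is not kubelet code:

```go
// Rebuild the systemd slice names seen above from QoS class and pod UID.
package main

import (
	"fmt"
	"strings"
)

func podSlice(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice",
		qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	// UIDs copied from the reconciler_common.go entries below.
	fmt.Println(podSlice("burstable", "293dec99-0158-41c1-ba5d-1c31149fed67"))  // coredns-674b8bbfcf-9dbqd
	fmt.Println(podSlice("besteffort", "31c13ead-4475-476c-a22d-7ad5dc225044")) // calico-apiserver-86d5586b69-b4v2f
}
```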
Oct 30 00:23:31.487809 systemd[1]: Created slice kubepods-besteffort-podc7da7a4a_cbcd_4d6d_82c2_808e14afe4d4.slice - libcontainer container kubepods-besteffort-podc7da7a4a_cbcd_4d6d_82c2_808e14afe4d4.slice. Oct 30 00:23:31.493682 systemd[1]: Created slice kubepods-burstable-pod71507947_1ff6_4e84_bd85_143eb0794551.slice - libcontainer container kubepods-burstable-pod71507947_1ff6_4e84_bd85_143eb0794551.slice. Oct 30 00:23:31.500892 kubelet[2936]: I1030 00:23:31.500686 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvs96\" (UniqueName: \"kubernetes.io/projected/e46b88e1-56de-4271-a039-c9f0466406b0-kube-api-access-zvs96\") pod \"calico-apiserver-86d5586b69-4fl76\" (UID: \"e46b88e1-56de-4271-a039-c9f0466406b0\") " pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" Oct 30 00:23:31.500892 kubelet[2936]: I1030 00:23:31.500732 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4-goldmane-ca-bundle\") pod \"goldmane-666569f655-8bq2q\" (UID: \"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4\") " pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.501219 kubelet[2936]: I1030 00:23:31.501208 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgpb\" (UniqueName: \"kubernetes.io/projected/293dec99-0158-41c1-ba5d-1c31149fed67-kube-api-access-jbgpb\") pod \"coredns-674b8bbfcf-9dbqd\" (UID: \"293dec99-0158-41c1-ba5d-1c31149fed67\") " pod="kube-system/coredns-674b8bbfcf-9dbqd" Oct 30 00:23:31.501422 kubelet[2936]: I1030 00:23:31.501282 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e46b88e1-56de-4271-a039-c9f0466406b0-calico-apiserver-certs\") pod \"calico-apiserver-86d5586b69-4fl76\" (UID: \"e46b88e1-56de-4271-a039-c9f0466406b0\") " pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" Oct 30 00:23:31.501422 kubelet[2936]: I1030 00:23:31.501296 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/31c13ead-4475-476c-a22d-7ad5dc225044-calico-apiserver-certs\") pod \"calico-apiserver-86d5586b69-b4v2f\" (UID: \"31c13ead-4475-476c-a22d-7ad5dc225044\") " pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" Oct 30 00:23:31.501422 kubelet[2936]: I1030 00:23:31.501308 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn6g\" (UniqueName: \"kubernetes.io/projected/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-kube-api-access-qrn6g\") pod \"whisker-84779b458-475tl\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " pod="calico-system/whisker-84779b458-475tl" Oct 30 00:23:31.501422 kubelet[2936]: I1030 00:23:31.501318 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71507947-1ff6-4e84-bd85-143eb0794551-config-volume\") pod \"coredns-674b8bbfcf-qr8zt\" (UID: \"71507947-1ff6-4e84-bd85-143eb0794551\") " pod="kube-system/coredns-674b8bbfcf-qr8zt" Oct 30 00:23:31.501723 kubelet[2936]: I1030 00:23:31.501336 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4-config\") pod \"goldmane-666569f655-8bq2q\" (UID: \"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4\") " pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.501723 kubelet[2936]: I1030 00:23:31.501608 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wqpz\" (UniqueName: \"kubernetes.io/projected/71507947-1ff6-4e84-bd85-143eb0794551-kube-api-access-7wqpz\") pod \"coredns-674b8bbfcf-qr8zt\" (UID: \"71507947-1ff6-4e84-bd85-143eb0794551\") " pod="kube-system/coredns-674b8bbfcf-qr8zt" Oct 30 00:23:31.501723 kubelet[2936]: I1030 00:23:31.501626 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/293dec99-0158-41c1-ba5d-1c31149fed67-config-volume\") pod \"coredns-674b8bbfcf-9dbqd\" (UID: \"293dec99-0158-41c1-ba5d-1c31149fed67\") " pod="kube-system/coredns-674b8bbfcf-9dbqd" Oct 30 00:23:31.501723 kubelet[2936]: I1030 00:23:31.501639 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle\") pod \"whisker-84779b458-475tl\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " pod="calico-system/whisker-84779b458-475tl" Oct 30 00:23:31.501723 kubelet[2936]: I1030 00:23:31.501654 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfsw8\" (UniqueName: \"kubernetes.io/projected/c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4-kube-api-access-sfsw8\") pod \"goldmane-666569f655-8bq2q\" (UID: \"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4\") " pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.502020 kubelet[2936]: I1030 00:23:31.501905 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffscj\" (UniqueName: \"kubernetes.io/projected/31c13ead-4475-476c-a22d-7ad5dc225044-kube-api-access-ffscj\") pod \"calico-apiserver-86d5586b69-b4v2f\" (UID: \"31c13ead-4475-476c-a22d-7ad5dc225044\") " pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" Oct 30 00:23:31.502020 kubelet[2936]: I1030 00:23:31.501929 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-backend-key-pair\") pod \"whisker-84779b458-475tl\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " pod="calico-system/whisker-84779b458-475tl" Oct 30 00:23:31.502020 kubelet[2936]: I1030 00:23:31.501939 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4-goldmane-key-pair\") pod \"goldmane-666569f655-8bq2q\" (UID: \"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4\") " pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.503301 kubelet[2936]: I1030 00:23:31.502073 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d50a52-6f08-4848-bb64-ac0260f39fa4-tigera-ca-bundle\") pod \"calico-kube-controllers-7c8fdc95c4-n9pkj\" (UID: \"61d50a52-6f08-4848-bb64-ac0260f39fa4\") " pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" Oct 30 00:23:31.503301 
kubelet[2936]: I1030 00:23:31.502092 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m7qj\" (UniqueName: \"kubernetes.io/projected/61d50a52-6f08-4848-bb64-ac0260f39fa4-kube-api-access-8m7qj\") pod \"calico-kube-controllers-7c8fdc95c4-n9pkj\" (UID: \"61d50a52-6f08-4848-bb64-ac0260f39fa4\") " pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" Oct 30 00:23:31.757212 containerd[1627]: time="2025-10-30T00:23:31.757147411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-9dbqd,Uid:293dec99-0158-41c1-ba5d-1c31149fed67,Namespace:kube-system,Attempt:0,}" Oct 30 00:23:31.770670 containerd[1627]: time="2025-10-30T00:23:31.770257294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c8fdc95c4-n9pkj,Uid:61d50a52-6f08-4848-bb64-ac0260f39fa4,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:31.798234 containerd[1627]: time="2025-10-30T00:23:31.798206267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qr8zt,Uid:71507947-1ff6-4e84-bd85-143eb0794551,Namespace:kube-system,Attempt:0,}" Oct 30 00:23:31.814782 containerd[1627]: time="2025-10-30T00:23:31.814748235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-8bq2q,Uid:c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:31.951760 containerd[1627]: time="2025-10-30T00:23:31.951727935Z" level=error msg="Failed to destroy network for sandbox \"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.953818 containerd[1627]: time="2025-10-30T00:23:31.953727619Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-8bq2q,Uid:c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.967904 containerd[1627]: time="2025-10-30T00:23:31.967821936Z" level=error msg="Failed to destroy network for sandbox \"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.971019 containerd[1627]: time="2025-10-30T00:23:31.971000896Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-9dbqd,Uid:293dec99-0158-41c1-ba5d-1c31149fed67,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.971271 kubelet[2936]: E1030 00:23:31.971238 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.971317 kubelet[2936]: E1030 00:23:31.971291 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.971317 kubelet[2936]: E1030 00:23:31.971307 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-8bq2q" Oct 30 00:23:31.971736 kubelet[2936]: E1030 00:23:31.971348 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-8bq2q_calico-system(c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-8bq2q_calico-system(c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3cc481011edf185929ac9df1e3e5838c509ec7602d8d8f4ebbddb205e7d03dba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:23:31.971736 kubelet[2936]: E1030 00:23:31.971363 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.971736 kubelet[2936]: E1030 00:23:31.971386 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-9dbqd" Oct 30 00:23:31.971816 kubelet[2936]: E1030 00:23:31.971397 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-9dbqd" Oct 30 00:23:31.971816 kubelet[2936]: E1030 00:23:31.971417 2936 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-9dbqd_kube-system(293dec99-0158-41c1-ba5d-1c31149fed67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-9dbqd_kube-system(293dec99-0158-41c1-ba5d-1c31149fed67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af3c5c09791c1db840cab93e7e6338515dd8af280aebaee24a63a8e4584547ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-9dbqd" podUID="293dec99-0158-41c1-ba5d-1c31149fed67" Oct 30 00:23:31.979595 containerd[1627]: time="2025-10-30T00:23:31.979452646Z" level=error msg="Failed to destroy network for sandbox \"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.981711 containerd[1627]: time="2025-10-30T00:23:31.981576188Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qr8zt,Uid:71507947-1ff6-4e84-bd85-143eb0794551,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.982067 kubelet[2936]: E1030 00:23:31.982025 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.982108 kubelet[2936]: E1030 00:23:31.982070 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qr8zt" Oct 30 00:23:31.982108 kubelet[2936]: E1030 00:23:31.982085 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qr8zt" Oct 30 00:23:31.982354 kubelet[2936]: E1030 00:23:31.982112 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-qr8zt_kube-system(71507947-1ff6-4e84-bd85-143eb0794551)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qr8zt_kube-system(71507947-1ff6-4e84-bd85-143eb0794551)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"41281687746c9aaab9406214654a5a0fd224e47621914362604ba7145ff19ec7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qr8zt" podUID="71507947-1ff6-4e84-bd85-143eb0794551" Oct 30 00:23:31.989718 containerd[1627]: time="2025-10-30T00:23:31.989647825Z" level=error msg="Failed to destroy network for sandbox \"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.990369 containerd[1627]: time="2025-10-30T00:23:31.990320439Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c8fdc95c4-n9pkj,Uid:61d50a52-6f08-4848-bb64-ac0260f39fa4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.990534 kubelet[2936]: E1030 00:23:31.990512 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:31.990567 kubelet[2936]: E1030 00:23:31.990545 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" Oct 30 00:23:31.990567 kubelet[2936]: E1030 00:23:31.990558 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" Oct 30 00:23:31.990606 kubelet[2936]: E1030 00:23:31.990589 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c8fdc95c4-n9pkj_calico-system(61d50a52-6f08-4848-bb64-ac0260f39fa4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c8fdc95c4-n9pkj_calico-system(61d50a52-6f08-4848-bb64-ac0260f39fa4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"748fae904005e1fc0d3ba8df6ce5590df89f0306cd685e4cbdc867da17616733\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:23:32.157787 systemd[1]: run-netns-cni\x2d7e355954\x2ddcae\x2d40db\x2d4a3d\x2d50fe462a1c49.mount: Deactivated successfully. Oct 30 00:23:32.177049 systemd[1]: Created slice kubepods-besteffort-pod54feb653_7cf9_4ac1_9a8d_a45c98b9a230.slice - libcontainer container kubepods-besteffort-pod54feb653_7cf9_4ac1_9a8d_a45c98b9a230.slice. Oct 30 00:23:32.181132 containerd[1627]: time="2025-10-30T00:23:32.180937192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7hzkb,Uid:54feb653-7cf9-4ac1-9a8d-a45c98b9a230,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:32.216353 containerd[1627]: time="2025-10-30T00:23:32.216226212Z" level=error msg="Failed to destroy network for sandbox \"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:32.217488 systemd[1]: run-netns-cni\x2d530a1dce\x2de493\x2d062b\x2d55fc\x2d790e35412165.mount: Deactivated successfully. Oct 30 00:23:32.217965 containerd[1627]: time="2025-10-30T00:23:32.217865453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7hzkb,Uid:54feb653-7cf9-4ac1-9a8d-a45c98b9a230,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:32.218260 kubelet[2936]: E1030 00:23:32.218144 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:32.218260 kubelet[2936]: E1030 00:23:32.218195 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:32.218260 kubelet[2936]: E1030 00:23:32.218210 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7hzkb" Oct 30 00:23:32.218697 kubelet[2936]: E1030 00:23:32.218473 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e98ebab530b5539a4aba5bb3a5a2d1db968bca0010378ee8e2608ab3b269677e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:32.354346 containerd[1627]: time="2025-10-30T00:23:32.353971360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Oct 30 00:23:32.603261 kubelet[2936]: E1030 00:23:32.603234 2936 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 30 00:23:32.603370 kubelet[2936]: E1030 00:23:32.603313 2936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle podName:3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da nodeName:}" failed. No retries permitted until 2025-10-30 00:23:33.103299756 +0000 UTC m=+32.075339554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle") pod "whisker-84779b458-475tl" (UID: "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da") : failed to sync configmap cache: timed out waiting for the condition Oct 30 00:23:32.606043 kubelet[2936]: E1030 00:23:32.605937 2936 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Oct 30 00:23:32.606043 kubelet[2936]: E1030 00:23:32.605978 2936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31c13ead-4475-476c-a22d-7ad5dc225044-calico-apiserver-certs podName:31c13ead-4475-476c-a22d-7ad5dc225044 nodeName:}" failed. No retries permitted until 2025-10-30 00:23:33.10596883 +0000 UTC m=+32.078008628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/31c13ead-4475-476c-a22d-7ad5dc225044-calico-apiserver-certs") pod "calico-apiserver-86d5586b69-b4v2f" (UID: "31c13ead-4475-476c-a22d-7ad5dc225044") : failed to sync secret cache: timed out waiting for the condition Oct 30 00:23:32.606131 kubelet[2936]: E1030 00:23:32.605938 2936 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Oct 30 00:23:32.606131 kubelet[2936]: E1030 00:23:32.606071 2936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e46b88e1-56de-4271-a039-c9f0466406b0-calico-apiserver-certs podName:e46b88e1-56de-4271-a039-c9f0466406b0 nodeName:}" failed. No retries permitted until 2025-10-30 00:23:33.106064348 +0000 UTC m=+32.078104146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/e46b88e1-56de-4271-a039-c9f0466406b0-calico-apiserver-certs") pod "calico-apiserver-86d5586b69-4fl76" (UID: "e46b88e1-56de-4271-a039-c9f0466406b0") : failed to sync secret cache: timed out waiting for the condition Oct 30 00:23:33.263705 containerd[1627]: time="2025-10-30T00:23:33.263636609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84779b458-475tl,Uid:3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:33.276602 containerd[1627]: time="2025-10-30T00:23:33.276499930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-4fl76,Uid:e46b88e1-56de-4271-a039-c9f0466406b0,Namespace:calico-apiserver,Attempt:0,}" Oct 30 00:23:33.285524 containerd[1627]: time="2025-10-30T00:23:33.285500095Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-b4v2f,Uid:31c13ead-4475-476c-a22d-7ad5dc225044,Namespace:calico-apiserver,Attempt:0,}" Oct 30 00:23:33.316182 containerd[1627]: time="2025-10-30T00:23:33.316155108Z" level=error msg="Failed to destroy network for sandbox \"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.318707 systemd[1]: run-netns-cni\x2d005176dc\x2dd95a\x2dc55f\x2dde05\x2dbc06172aeeeb.mount: Deactivated successfully. Oct 30 00:23:33.320576 containerd[1627]: time="2025-10-30T00:23:33.320552239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84779b458-475tl,Uid:3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.321049 kubelet[2936]: E1030 00:23:33.320834 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.321049 kubelet[2936]: E1030 00:23:33.320869 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84779b458-475tl" Oct 30 00:23:33.321049 kubelet[2936]: E1030 00:23:33.320883 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-84779b458-475tl" Oct 30 00:23:33.321681 kubelet[2936]: E1030 00:23:33.320916 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-84779b458-475tl_calico-system(3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-84779b458-475tl_calico-system(3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"98fe5a8fb83b585313820fe0fc3e77bbeb883c5b419469e3b9f0caf3df64b209\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84779b458-475tl" podUID="3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da" Oct 30 00:23:33.338247 containerd[1627]: time="2025-10-30T00:23:33.338138761Z" level=error msg="Failed to destroy network for sandbox \"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.338743 containerd[1627]: time="2025-10-30T00:23:33.338725461Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-4fl76,Uid:e46b88e1-56de-4271-a039-c9f0466406b0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.339052 kubelet[2936]: E1030 00:23:33.338953 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.339052 kubelet[2936]: E1030 00:23:33.338992 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" Oct 30 00:23:33.339052 kubelet[2936]: E1030 00:23:33.339007 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" Oct 30 00:23:33.339209 kubelet[2936]: E1030 00:23:33.339166 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86d5586b69-4fl76_calico-apiserver(e46b88e1-56de-4271-a039-c9f0466406b0)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-apiserver-86d5586b69-4fl76_calico-apiserver(e46b88e1-56de-4271-a039-c9f0466406b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d63b35ebc1ada38477bd7e757c19d677a76b68d8737068e0651f60f227161f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:23:33.346899 containerd[1627]: time="2025-10-30T00:23:33.346862376Z" level=error msg="Failed to destroy network for sandbox \"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.347439 containerd[1627]: time="2025-10-30T00:23:33.347419578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-b4v2f,Uid:31c13ead-4475-476c-a22d-7ad5dc225044,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.347583 kubelet[2936]: E1030 00:23:33.347558 2936 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 30 00:23:33.347617 kubelet[2936]: E1030 00:23:33.347599 2936 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" Oct 30 00:23:33.348620 kubelet[2936]: E1030 00:23:33.347617 2936 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" Oct 30 00:23:33.348620 kubelet[2936]: E1030 00:23:33.347651 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-86d5586b69-b4v2f_calico-apiserver(31c13ead-4475-476c-a22d-7ad5dc225044)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-86d5586b69-b4v2f_calico-apiserver(31c13ead-4475-476c-a22d-7ad5dc225044)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f393d6d006901dbc0335e18121c1479c3c3d72c6292ac8ea3b7560fa40fc824c\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:23:34.157841 systemd[1]: run-netns-cni\x2db3de5b7f\x2db768\x2d5ad4\x2d8a1e\x2d3b28387c5ccb.mount: Deactivated successfully. Oct 30 00:23:34.157899 systemd[1]: run-netns-cni\x2ddcf34683\x2d81f6\x2de3b0\x2d5969\x2d4d241b48bb39.mount: Deactivated successfully. Oct 30 00:23:39.863696 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2164642241.mount: Deactivated successfully. Oct 30 00:23:40.411777 containerd[1627]: time="2025-10-30T00:23:40.411620736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:40.418268 containerd[1627]: time="2025-10-30T00:23:40.418220609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Oct 30 00:23:40.459617 containerd[1627]: time="2025-10-30T00:23:40.459560427Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:40.467387 containerd[1627]: time="2025-10-30T00:23:40.467344481Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 30 00:23:40.467790 containerd[1627]: time="2025-10-30T00:23:40.467584843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 8.113587375s" Oct 30 00:23:40.467790 containerd[1627]: time="2025-10-30T00:23:40.467603787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Oct 30 00:23:40.502364 containerd[1627]: time="2025-10-30T00:23:40.502341902Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 30 00:23:40.745660 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount403072907.mount: Deactivated successfully. 
Oct 30 00:23:40.747566 containerd[1627]: time="2025-10-30T00:23:40.746615283Z" level=info msg="Container 56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:40.843739 containerd[1627]: time="2025-10-30T00:23:40.843706196Z" level=info msg="CreateContainer within sandbox \"d57b6e188bb1da1d651b3532297c23321dc5e2acc7e3d2b0c01f37fafff3054a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\"" Oct 30 00:23:40.844404 containerd[1627]: time="2025-10-30T00:23:40.844389815Z" level=info msg="StartContainer for \"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\"" Oct 30 00:23:40.849144 containerd[1627]: time="2025-10-30T00:23:40.848953546Z" level=info msg="connecting to shim 56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a" address="unix:///run/containerd/s/20a415735b9291a077959f7501f6b9f9785c8518a7c0a04c5b5ba1a0816571a2" protocol=ttrpc version=3 Oct 30 00:23:40.944231 systemd[1]: Started cri-containerd-56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a.scope - libcontainer container 56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a. Oct 30 00:23:40.988505 containerd[1627]: time="2025-10-30T00:23:40.988480454Z" level=info msg="StartContainer for \"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\" returns successfully" Oct 30 00:23:41.132622 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 30 00:23:41.134898 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Oct 30 00:23:41.478432 kubelet[2936]: I1030 00:23:41.475580 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9vl72" podStartSLOduration=2.307309122 podStartE2EDuration="21.475568559s" podCreationTimestamp="2025-10-30 00:23:20 +0000 UTC" firstStartedPulling="2025-10-30 00:23:21.300863107 +0000 UTC m=+20.272902905" lastFinishedPulling="2025-10-30 00:23:40.46912255 +0000 UTC m=+39.441162342" observedRunningTime="2025-10-30 00:23:41.475361609 +0000 UTC m=+40.447401405" watchObservedRunningTime="2025-10-30 00:23:41.475568559 +0000 UTC m=+40.447608357" Oct 30 00:23:41.562383 kubelet[2936]: I1030 00:23:41.562075 2936 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrn6g\" (UniqueName: \"kubernetes.io/projected/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-kube-api-access-qrn6g\") pod \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " Oct 30 00:23:41.562383 kubelet[2936]: I1030 00:23:41.562145 2936 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-backend-key-pair\") pod \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " Oct 30 00:23:41.562383 kubelet[2936]: I1030 00:23:41.562164 2936 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle\") pod \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\" (UID: \"3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da\") " Oct 30 00:23:41.572341 systemd[1]: 
var-lib-kubelet-pods-3e1c4ef0\x2de4a3\x2d4ed0\x2d8db5\x2d48e57bb8c5da-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Oct 30 00:23:41.575167 kubelet[2936]: I1030 00:23:41.575125 2936 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da" (UID: "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Oct 30 00:23:41.575335 kubelet[2936]: I1030 00:23:41.575280 2936 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da" (UID: "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Oct 30 00:23:41.576132 kubelet[2936]: I1030 00:23:41.576113 2936 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-kube-api-access-qrn6g" (OuterVolumeSpecName: "kube-api-access-qrn6g") pod "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da" (UID: "3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da"). InnerVolumeSpecName "kube-api-access-qrn6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Oct 30 00:23:41.576253 systemd[1]: var-lib-kubelet-pods-3e1c4ef0\x2de4a3\x2d4ed0\x2d8db5\x2d48e57bb8c5da-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqrn6g.mount: Deactivated successfully. Oct 30 00:23:41.601874 containerd[1627]: time="2025-10-30T00:23:41.601835791Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\" id:\"7eeb30f581facfae4a0d40c922132b264110cc3db5a7f8c35f2c37ea01a2f116\" pid:4028 exit_status:1 exited_at:{seconds:1761783821 nanos:601636885}" Oct 30 00:23:41.663211 kubelet[2936]: I1030 00:23:41.663172 2936 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrn6g\" (UniqueName: \"kubernetes.io/projected/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-kube-api-access-qrn6g\") on node \"localhost\" DevicePath \"\"" Oct 30 00:23:41.663211 kubelet[2936]: I1030 00:23:41.663199 2936 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Oct 30 00:23:41.663211 kubelet[2936]: I1030 00:23:41.663208 2936 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Oct 30 00:23:41.754193 systemd[1]: Removed slice kubepods-besteffort-pod3e1c4ef0_e4a3_4ed0_8db5_48e57bb8c5da.slice - libcontainer container kubepods-besteffort-pod3e1c4ef0_e4a3_4ed0_8db5_48e57bb8c5da.slice. Oct 30 00:23:41.821045 systemd[1]: Created slice kubepods-besteffort-podad72706a_e193_45fa_8c06_007a36e881f3.slice - libcontainer container kubepods-besteffort-podad72706a_e193_45fa_8c06_007a36e881f3.slice. 
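The mount unit names in the teardown entries above are systemd-escaped paths: '/' is rendered as '-', while a literal '-' or '~' inside a path component is hex-escaped as \x2d or \x7e. Reversing the escaping (roughly what systemd-escape --unescape --path does) recovers the kubelet volume path of the removed whisker pod; the helper below is an illustrative sketch, not part of any component logged here:

    // unescape_unit.go - recover the filesystem path behind a systemd .mount
    // unit name like the ones deactivated above.
    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    func unescapeUnitPath(unit string) (string, error) {
        name := strings.TrimSuffix(unit, ".mount")
        var out strings.Builder
        out.WriteByte('/')
        for i := 0; i < len(name); i++ {
            switch {
            case name[i] == '-':
                out.WriteByte('/') // an unescaped '-' separates path components
            case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
                b, err := strconv.ParseUint(name[i+2:i+4], 16, 8)
                if err != nil {
                    return "", fmt.Errorf("bad escape at %d: %w", i, err)
                }
                out.WriteByte(byte(b))
                i += 3 // skip the consumed "xNN"
            default:
                out.WriteByte(name[i])
            }
        }
        return out.String(), nil
    }

    func main() {
        unit := `var-lib-kubelet-pods-3e1c4ef0\x2de4a3\x2d4ed0\x2d8db5\x2d48e57bb8c5da-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount`
        path, err := unescapeUnitPath(unit)
        if err != nil {
            panic(err)
        }
        // Prints: /var/lib/kubelet/pods/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da/volumes/kubernetes.io~secret/whisker-backend-key-pair
        fmt.Println(path)
    }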
Oct 30 00:23:41.864049 kubelet[2936]: I1030 00:23:41.863969 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdtk\" (UniqueName: \"kubernetes.io/projected/ad72706a-e193-45fa-8c06-007a36e881f3-kube-api-access-rhdtk\") pod \"whisker-c95cf544d-r474l\" (UID: \"ad72706a-e193-45fa-8c06-007a36e881f3\") " pod="calico-system/whisker-c95cf544d-r474l" Oct 30 00:23:41.864049 kubelet[2936]: I1030 00:23:41.864002 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ad72706a-e193-45fa-8c06-007a36e881f3-whisker-backend-key-pair\") pod \"whisker-c95cf544d-r474l\" (UID: \"ad72706a-e193-45fa-8c06-007a36e881f3\") " pod="calico-system/whisker-c95cf544d-r474l" Oct 30 00:23:41.864049 kubelet[2936]: I1030 00:23:41.864018 2936 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad72706a-e193-45fa-8c06-007a36e881f3-whisker-ca-bundle\") pod \"whisker-c95cf544d-r474l\" (UID: \"ad72706a-e193-45fa-8c06-007a36e881f3\") " pod="calico-system/whisker-c95cf544d-r474l" Oct 30 00:23:42.123272 containerd[1627]: time="2025-10-30T00:23:42.123245919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c95cf544d-r474l,Uid:ad72706a-e193-45fa-8c06-007a36e881f3,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:42.752322 systemd-networkd[1507]: calib53c58b7370: Link UP Oct 30 00:23:42.753364 systemd-networkd[1507]: calib53c58b7370: Gained carrier Oct 30 00:23:42.756545 containerd[1627]: time="2025-10-30T00:23:42.756098641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\" id:\"8d814c72b5c5fd8d332579081acaea9454c49a64baaf3ff778e7864244e33c06\" pid:4117 exit_status:1 exited_at:{seconds:1761783822 nanos:755601509}" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.230 [INFO][4047] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.322 [INFO][4047] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--c95cf544d--r474l-eth0 whisker-c95cf544d- calico-system ad72706a-e193-45fa-8c06-007a36e881f3 894 0 2025-10-30 00:23:41 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:c95cf544d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-c95cf544d-r474l eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib53c58b7370 [] [] }} ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.322 [INFO][4047] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.661 [INFO][4055] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" HandleID="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Workload="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.666 [INFO][4055] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" HandleID="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Workload="localhost-k8s-whisker--c95cf544d--r474l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103b70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-c95cf544d-r474l", "timestamp":"2025-10-30 00:23:42.661484991 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.666 [INFO][4055] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.668 [INFO][4055] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.668 [INFO][4055] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.696 [INFO][4055] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.713 [INFO][4055] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.720 [INFO][4055] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.722 [INFO][4055] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.724 [INFO][4055] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.724 [INFO][4055] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.725 [INFO][4055] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.727 [INFO][4055] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.732 [INFO][4055] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.732 [INFO][4055] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] 
handle="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" host="localhost" Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.732 [INFO][4055] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Oct 30 00:23:42.764167 containerd[1627]: 2025-10-30 00:23:42.732 [INFO][4055] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" HandleID="k8s-pod-network.94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Workload="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.734 [INFO][4047] cni-plugin/k8s.go 418: Populated endpoint ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--c95cf544d--r474l-eth0", GenerateName:"whisker-c95cf544d-", Namespace:"calico-system", SelfLink:"", UID:"ad72706a-e193-45fa-8c06-007a36e881f3", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c95cf544d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-c95cf544d-r474l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib53c58b7370", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.734 [INFO][4047] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.734 [INFO][4047] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib53c58b7370 ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.748 [INFO][4047] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.748 [INFO][4047] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--c95cf544d--r474l-eth0", GenerateName:"whisker-c95cf544d-", Namespace:"calico-system", SelfLink:"", UID:"ad72706a-e193-45fa-8c06-007a36e881f3", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c95cf544d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c", Pod:"whisker-c95cf544d-r474l", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib53c58b7370", MAC:"56:2b:e4:1a:37:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:42.764591 containerd[1627]: 2025-10-30 00:23:42.759 [INFO][4047] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" Namespace="calico-system" Pod="whisker-c95cf544d-r474l" WorkloadEndpoint="localhost-k8s-whisker--c95cf544d--r474l-eth0" Oct 30 00:23:42.880836 containerd[1627]: time="2025-10-30T00:23:42.880799302Z" level=info msg="connecting to shim 94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c" address="unix:///run/containerd/s/aafc61fafbc3784a2251693897691c32e05e279d750dcc6eefcbfcab05d62906" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:42.903204 systemd[1]: Started cri-containerd-94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c.scope - libcontainer container 94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c. 
Oct 30 00:23:42.917093 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:42.953868 containerd[1627]: time="2025-10-30T00:23:42.953286610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c95cf544d-r474l,Uid:ad72706a-e193-45fa-8c06-007a36e881f3,Namespace:calico-system,Attempt:0,} returns sandbox id \"94685c282254c940db20287f623bf1c7326263163829a973d964ba2d3224b54c\"" Oct 30 00:23:42.962067 containerd[1627]: time="2025-10-30T00:23:42.957985528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 30 00:23:43.174394 containerd[1627]: time="2025-10-30T00:23:43.174206623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qr8zt,Uid:71507947-1ff6-4e84-bd85-143eb0794551,Namespace:kube-system,Attempt:0,}" Oct 30 00:23:43.175045 containerd[1627]: time="2025-10-30T00:23:43.175014975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c8fdc95c4-n9pkj,Uid:61d50a52-6f08-4848-bb64-ac0260f39fa4,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:43.176161 kubelet[2936]: I1030 00:23:43.176145 2936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da" path="/var/lib/kubelet/pods/3e1c4ef0-e4a3-4ed0-8db5-48e57bb8c5da/volumes" Oct 30 00:23:43.241780 systemd-networkd[1507]: vxlan.calico: Link UP Oct 30 00:23:43.241785 systemd-networkd[1507]: vxlan.calico: Gained carrier Oct 30 00:23:43.329267 containerd[1627]: time="2025-10-30T00:23:43.329123061Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:43.336718 containerd[1627]: time="2025-10-30T00:23:43.336630668Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 30 00:23:43.336718 containerd[1627]: time="2025-10-30T00:23:43.336690081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 30 00:23:43.409603 systemd-networkd[1507]: calia2b9af62ba6: Link UP Oct 30 00:23:43.409734 systemd-networkd[1507]: calia2b9af62ba6: Gained carrier Oct 30 00:23:43.430767 kubelet[2936]: E1030 00:23:43.430593 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:23:43.430870 kubelet[2936]: E1030 00:23:43.430856 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.268 [INFO][4273] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0 coredns-674b8bbfcf- kube-system 71507947-1ff6-4e84-bd85-143eb0794551 821 0 2025-10-30 00:23:07 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-qr8zt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia2b9af62ba6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.268 [INFO][4273] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.355 [INFO][4311] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" HandleID="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Workload="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.355 [INFO][4311] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" HandleID="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Workload="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fdf0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-qr8zt", "timestamp":"2025-10-30 00:23:43.355161644 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.355 [INFO][4311] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.355 [INFO][4311] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.355 [INFO][4311] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.366 [INFO][4311] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.370 [INFO][4311] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.375 [INFO][4311] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.379 [INFO][4311] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.380 [INFO][4311] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.380 [INFO][4311] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.382 [INFO][4311] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671 Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.395 [INFO][4311] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.400 [INFO][4311] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.400 [INFO][4311] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" host="localhost" Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.400 [INFO][4311] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:43.442919 containerd[1627]: 2025-10-30 00:23:43.400 [INFO][4311] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" HandleID="k8s-pod-network.555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Workload="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.457088 containerd[1627]: 2025-10-30 00:23:43.405 [INFO][4273] cni-plugin/k8s.go 418: Populated endpoint ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"71507947-1ff6-4e84-bd85-143eb0794551", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-qr8zt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia2b9af62ba6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:43.457088 containerd[1627]: 2025-10-30 00:23:43.405 [INFO][4273] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.457088 containerd[1627]: 2025-10-30 00:23:43.405 [INFO][4273] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia2b9af62ba6 ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.457088 containerd[1627]: 2025-10-30 00:23:43.410 [INFO][4273] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.457088 
containerd[1627]: 2025-10-30 00:23:43.410 [INFO][4273] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"71507947-1ff6-4e84-bd85-143eb0794551", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671", Pod:"coredns-674b8bbfcf-qr8zt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia2b9af62ba6", MAC:"26:a8:5c:36:15:70", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:43.457236 containerd[1627]: 2025-10-30 00:23:43.437 [INFO][4273] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" Namespace="kube-system" Pod="coredns-674b8bbfcf-qr8zt" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--qr8zt-eth0" Oct 30 00:23:43.531878 containerd[1627]: time="2025-10-30T00:23:43.531830517Z" level=info msg="connecting to shim 555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671" address="unix:///run/containerd/s/8b9d8d45fa58d2fefab12be062c9766935a0fd01bfb0c3c28c7eaf17b1e4257e" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:43.548244 systemd-networkd[1507]: cali735865360d9: Link UP Oct 30 00:23:43.549289 systemd-networkd[1507]: cali735865360d9: Gained carrier Oct 30 00:23:43.566125 systemd[1]: Started cri-containerd-555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671.scope - libcontainer container 555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671. 
Oct 30 00:23:43.575000 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.370 [INFO][4287] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0 calico-kube-controllers-7c8fdc95c4- calico-system 61d50a52-6f08-4848-bb64-ac0260f39fa4 819 0 2025-10-30 00:23:21 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c8fdc95c4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7c8fdc95c4-n9pkj eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali735865360d9 [] [] }} ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.370 [INFO][4287] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.449 [INFO][4330] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" HandleID="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Workload="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.451 [INFO][4330] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" HandleID="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Workload="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e140), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7c8fdc95c4-n9pkj", "timestamp":"2025-10-30 00:23:43.449223543 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.451 [INFO][4330] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.451 [INFO][4330] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.451 [INFO][4330] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.467 [INFO][4330] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.471 [INFO][4330] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.479 [INFO][4330] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.482 [INFO][4330] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.485 [INFO][4330] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.486 [INFO][4330] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.489 [INFO][4330] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.499 [INFO][4330] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.542 [INFO][4330] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.542 [INFO][4330] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" host="localhost" Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.542 [INFO][4330] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:43.579512 containerd[1627]: 2025-10-30 00:23:43.542 [INFO][4330] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" HandleID="k8s-pod-network.e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Workload="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.545 [INFO][4287] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0", GenerateName:"calico-kube-controllers-7c8fdc95c4-", Namespace:"calico-system", SelfLink:"", UID:"61d50a52-6f08-4848-bb64-ac0260f39fa4", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c8fdc95c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7c8fdc95c4-n9pkj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali735865360d9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.545 [INFO][4287] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.545 [INFO][4287] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali735865360d9 ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.549 [INFO][4287] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.550 [INFO][4287] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0", GenerateName:"calico-kube-controllers-7c8fdc95c4-", Namespace:"calico-system", SelfLink:"", UID:"61d50a52-6f08-4848-bb64-ac0260f39fa4", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c8fdc95c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b", Pod:"calico-kube-controllers-7c8fdc95c4-n9pkj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali735865360d9", MAC:"02:58:21:1b:96:29", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:43.580173 containerd[1627]: 2025-10-30 00:23:43.576 [INFO][4287] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" Namespace="calico-system" Pod="calico-kube-controllers-7c8fdc95c4-n9pkj" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7c8fdc95c4--n9pkj-eth0" Oct 30 00:23:43.584792 kubelet[2936]: E1030 00:23:43.584754 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:00f8961b33664e97801f034c95c42fce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:43.609360 containerd[1627]: time="2025-10-30T00:23:43.609122233Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 30 00:23:43.623447 containerd[1627]: time="2025-10-30T00:23:43.623278756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qr8zt,Uid:71507947-1ff6-4e84-bd85-143eb0794551,Namespace:kube-system,Attempt:0,} returns sandbox id \"555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671\"" Oct 30 00:23:43.672167 containerd[1627]: time="2025-10-30T00:23:43.672140606Z" level=info msg="connecting to shim e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b" address="unix:///run/containerd/s/d143b201a8599577530ef708e21e1b406291b5db9f0228c670ad8c9916e2ffbc" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:43.692117 systemd[1]: Started cri-containerd-e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b.scope - libcontainer container e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b. 
Oct 30 00:23:43.701894 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:43.735492 containerd[1627]: time="2025-10-30T00:23:43.735309901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c8fdc95c4-n9pkj,Uid:61d50a52-6f08-4848-bb64-ac0260f39fa4,Namespace:calico-system,Attempt:0,} returns sandbox id \"e8ca8e9a93ff940eab1b9345058a4d47ff78f2ef8c0d44885d5bd2843a21b46b\"" Oct 30 00:23:43.755204 containerd[1627]: time="2025-10-30T00:23:43.755176130Z" level=info msg="CreateContainer within sandbox \"555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 30 00:23:43.776166 containerd[1627]: time="2025-10-30T00:23:43.776136164Z" level=info msg="Container ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:43.781365 containerd[1627]: time="2025-10-30T00:23:43.781279397Z" level=info msg="CreateContainer within sandbox \"555e9f86897dcab76a51055b03946270a3a4f549def7ecfdfbf64b3455b79671\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb\"" Oct 30 00:23:43.781804 containerd[1627]: time="2025-10-30T00:23:43.781638370Z" level=info msg="StartContainer for \"ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb\"" Oct 30 00:23:43.783183 containerd[1627]: time="2025-10-30T00:23:43.783108203Z" level=info msg="connecting to shim ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb" address="unix:///run/containerd/s/8b9d8d45fa58d2fefab12be062c9766935a0fd01bfb0c3c28c7eaf17b1e4257e" protocol=ttrpc version=3 Oct 30 00:23:43.802240 systemd[1]: Started cri-containerd-ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb.scope - libcontainer container ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb. 
Oct 30 00:23:43.833471 containerd[1627]: time="2025-10-30T00:23:43.833445315Z" level=info msg="StartContainer for \"ab55131388128de80b1000cf92e22e5c1eb2e810bff19a6b8207a2460c90b2eb\" returns successfully" Oct 30 00:23:43.988568 containerd[1627]: time="2025-10-30T00:23:43.988341946Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:43.988842 containerd[1627]: time="2025-10-30T00:23:43.988809370Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 30 00:23:43.988885 containerd[1627]: time="2025-10-30T00:23:43.988875244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 30 00:23:43.989131 kubelet[2936]: E1030 00:23:43.988984 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:23:43.989131 kubelet[2936]: E1030 00:23:43.989017 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:23:43.989297 containerd[1627]: time="2025-10-30T00:23:43.989261478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 30 00:23:43.989386 kubelet[2936]: E1030 00:23:43.989346 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:43.990986 kubelet[2936]: E1030 00:23:43.990714 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:23:44.166838 containerd[1627]: time="2025-10-30T00:23:44.166719186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-4fl76,Uid:e46b88e1-56de-4271-a039-c9f0466406b0,Namespace:calico-apiserver,Attempt:0,}" Oct 30 00:23:44.288886 systemd-networkd[1507]: cali0760204f3ea: Link UP Oct 30 00:23:44.289904 systemd-networkd[1507]: cali0760204f3ea: Gained carrier Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.227 [INFO][4514] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0 calico-apiserver-86d5586b69- calico-apiserver e46b88e1-56de-4271-a039-c9f0466406b0 815 0 2025-10-30 00:23:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86d5586b69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-86d5586b69-4fl76 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0760204f3ea [] [] }} ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.227 [INFO][4514] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.243 [INFO][4525] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" HandleID="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Workload="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.243 [INFO][4525] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" HandleID="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Workload="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-86d5586b69-4fl76", "timestamp":"2025-10-30 00:23:44.243780607 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.243 [INFO][4525] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.243 [INFO][4525] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.244 [INFO][4525] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.251 [INFO][4525] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.254 [INFO][4525] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.256 [INFO][4525] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.257 [INFO][4525] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.259 [INFO][4525] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.259 [INFO][4525] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.260 [INFO][4525] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991 Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.270 [INFO][4525] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.283 [INFO][4525] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.283 [INFO][4525] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" host="localhost" Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.283 [INFO][4525] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:44.316982 containerd[1627]: 2025-10-30 00:23:44.283 [INFO][4525] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" HandleID="k8s-pod-network.fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Workload="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.285 [INFO][4514] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0", GenerateName:"calico-apiserver-86d5586b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"e46b88e1-56de-4271-a039-c9f0466406b0", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86d5586b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-86d5586b69-4fl76", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0760204f3ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.285 [INFO][4514] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.285 [INFO][4514] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0760204f3ea ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.290 [INFO][4514] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.290 [INFO][4514] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0", GenerateName:"calico-apiserver-86d5586b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"e46b88e1-56de-4271-a039-c9f0466406b0", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86d5586b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991", Pod:"calico-apiserver-86d5586b69-4fl76", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0760204f3ea", MAC:"6a:03:e5:a9:2f:47", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:44.339613 containerd[1627]: 2025-10-30 00:23:44.313 [INFO][4514] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-4fl76" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--4fl76-eth0" Oct 30 00:23:44.345579 containerd[1627]: time="2025-10-30T00:23:44.345549186Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:44.347574 containerd[1627]: time="2025-10-30T00:23:44.347529704Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 30 00:23:44.347700 containerd[1627]: time="2025-10-30T00:23:44.347683639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 30 00:23:44.347914 kubelet[2936]: E1030 00:23:44.347870 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:23:44.348623 kubelet[2936]: E1030 00:23:44.347929 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:23:44.348623 kubelet[2936]: E1030 00:23:44.348134 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8m7qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7c8fdc95c4-n9pkj_calico-system(61d50a52-6f08-4848-bb64-ac0260f39fa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:44.349639 kubelet[2936]: E1030 00:23:44.349595 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:23:44.363554 containerd[1627]: time="2025-10-30T00:23:44.363282149Z" level=info msg="connecting to shim fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991" address="unix:///run/containerd/s/59c81e418d131220047d50a7288d0a9bfffc42026020e79c3f373c95bdcaa553" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:44.380141 systemd[1]: Started cri-containerd-fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991.scope - libcontainer container fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991. Oct 30 00:23:44.388768 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:44.417006 containerd[1627]: time="2025-10-30T00:23:44.416965056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-4fl76,Uid:e46b88e1-56de-4271-a039-c9f0466406b0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fb69311179f5a47c1868d48483af3ea5a183386f9d30892d2b62b8cfcb960991\"" Oct 30 00:23:44.418315 containerd[1627]: time="2025-10-30T00:23:44.418201255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:23:44.500130 systemd-networkd[1507]: vxlan.calico: Gained IPv6LL Oct 30 00:23:44.692124 systemd-networkd[1507]: calia2b9af62ba6: Gained IPv6LL Oct 30 00:23:44.692619 systemd-networkd[1507]: calib53c58b7370: Gained IPv6LL Oct 30 00:23:44.739178 kubelet[2936]: E1030 00:23:44.738879 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:23:44.759375 kubelet[2936]: E1030 00:23:44.759319 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:23:44.786256 containerd[1627]: time="2025-10-30T00:23:44.786204859Z" level=info msg="fetch failed 
after status: 404 Not Found" host=ghcr.io Oct 30 00:23:44.786608 containerd[1627]: time="2025-10-30T00:23:44.786589814Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:23:44.786673 containerd[1627]: time="2025-10-30T00:23:44.786657095Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:23:44.787912 kubelet[2936]: E1030 00:23:44.787469 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:44.787912 kubelet[2936]: E1030 00:23:44.787504 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:44.787912 kubelet[2936]: E1030 00:23:44.787618 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvs96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-4fl76_calico-apiserver(e46b88e1-56de-4271-a039-c9f0466406b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:44.789048 kubelet[2936]: E1030 00:23:44.788972 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:23:44.822158 kubelet[2936]: I1030 00:23:44.802733 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qr8zt" podStartSLOduration=37.802717091 podStartE2EDuration="37.802717091s" podCreationTimestamp="2025-10-30 00:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:44.801334792 +0000 UTC m=+43.773374588" watchObservedRunningTime="2025-10-30 00:23:44.802717091 +0000 UTC m=+43.774756895" Oct 30 00:23:45.140168 systemd-networkd[1507]: cali735865360d9: Gained IPv6LL Oct 30 00:23:45.167503 containerd[1627]: time="2025-10-30T00:23:45.167407957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-9dbqd,Uid:293dec99-0158-41c1-ba5d-1c31149fed67,Namespace:kube-system,Attempt:0,}" Oct 30 00:23:45.168095 containerd[1627]: time="2025-10-30T00:23:45.167995912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7hzkb,Uid:54feb653-7cf9-4ac1-9a8d-a45c98b9a230,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:45.283988 systemd-networkd[1507]: cali6c0bfebcc3f: Link UP Oct 30 00:23:45.284796 systemd-networkd[1507]: cali6c0bfebcc3f: Gained carrier Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.214 [INFO][4590] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0 coredns-674b8bbfcf- kube-system 293dec99-0158-41c1-ba5d-1c31149fed67 810 0 2025-10-30 00:23:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-9dbqd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6c0bfebcc3f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.214 [INFO][4590] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.238 [INFO][4617] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" HandleID="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Workload="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.238 [INFO][4617] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" HandleID="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Workload="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d50f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-9dbqd", "timestamp":"2025-10-30 00:23:45.238749567 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.238 [INFO][4617] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.239 [INFO][4617] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.239 [INFO][4617] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.244 [INFO][4617] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.248 [INFO][4617] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.252 [INFO][4617] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.254 [INFO][4617] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.257 [INFO][4617] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.257 [INFO][4617] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.259 [INFO][4617] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0 Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.263 [INFO][4617] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.270 [INFO][4617] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.271 [INFO][4617] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" host="localhost" Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.271 [INFO][4617] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:45.295659 containerd[1627]: 2025-10-30 00:23:45.271 [INFO][4617] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" HandleID="k8s-pod-network.6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Workload="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.298598 containerd[1627]: 2025-10-30 00:23:45.276 [INFO][4590] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293dec99-0158-41c1-ba5d-1c31149fed67", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-9dbqd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6c0bfebcc3f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:45.298598 containerd[1627]: 2025-10-30 00:23:45.276 [INFO][4590] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.298598 containerd[1627]: 2025-10-30 00:23:45.277 [INFO][4590] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6c0bfebcc3f ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.298598 containerd[1627]: 2025-10-30 00:23:45.285 [INFO][4590] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.298598 
containerd[1627]: 2025-10-30 00:23:45.285 [INFO][4590] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"293dec99-0158-41c1-ba5d-1c31149fed67", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0", Pod:"coredns-674b8bbfcf-9dbqd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6c0bfebcc3f", MAC:"a2:7c:9a:f5:5c:bd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:45.298778 containerd[1627]: 2025-10-30 00:23:45.293 [INFO][4590] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" Namespace="kube-system" Pod="coredns-674b8bbfcf-9dbqd" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--9dbqd-eth0" Oct 30 00:23:45.310195 containerd[1627]: time="2025-10-30T00:23:45.310157726Z" level=info msg="connecting to shim 6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0" address="unix:///run/containerd/s/4687cb040b094db9dc51780825b6f85283ca6758305e1f7e4ab37b7307922b93" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:45.331145 systemd[1]: Started cri-containerd-6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0.scope - libcontainer container 6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0. 
Oct 30 00:23:45.340100 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:45.369125 systemd-networkd[1507]: cali07aa15a4522: Link UP Oct 30 00:23:45.369893 systemd-networkd[1507]: cali07aa15a4522: Gained carrier Oct 30 00:23:45.380363 containerd[1627]: time="2025-10-30T00:23:45.380340647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-9dbqd,Uid:293dec99-0158-41c1-ba5d-1c31149fed67,Namespace:kube-system,Attempt:0,} returns sandbox id \"6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0\"" Oct 30 00:23:45.386279 containerd[1627]: time="2025-10-30T00:23:45.386241493Z" level=info msg="CreateContainer within sandbox \"6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.226 [INFO][4594] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7hzkb-eth0 csi-node-driver- calico-system 54feb653-7cf9-4ac1-9a8d-a45c98b9a230 703 0 2025-10-30 00:23:20 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7hzkb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali07aa15a4522 [] [] }} ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.226 [INFO][4594] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.259 [INFO][4625] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" HandleID="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Workload="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.260 [INFO][4625] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" HandleID="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Workload="localhost-k8s-csi--node--driver--7hzkb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f260), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7hzkb", "timestamp":"2025-10-30 00:23:45.259972618 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.260 [INFO][4625] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.271 [INFO][4625] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.271 [INFO][4625] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.344 [INFO][4625] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.348 [INFO][4625] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.352 [INFO][4625] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.353 [INFO][4625] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.355 [INFO][4625] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.355 [INFO][4625] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.356 [INFO][4625] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415 Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.358 [INFO][4625] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.362 [INFO][4625] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.362 [INFO][4625] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" host="localhost" Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.362 [INFO][4625] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:45.387688 containerd[1627]: 2025-10-30 00:23:45.362 [INFO][4625] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" HandleID="k8s-pod-network.160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Workload="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.365 [INFO][4594] cni-plugin/k8s.go 418: Populated endpoint ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7hzkb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"54feb653-7cf9-4ac1-9a8d-a45c98b9a230", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7hzkb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali07aa15a4522", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.365 [INFO][4594] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.365 [INFO][4594] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07aa15a4522 ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.369 [INFO][4594] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.370 [INFO][4594] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7hzkb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"54feb653-7cf9-4ac1-9a8d-a45c98b9a230", ResourceVersion:"703", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415", Pod:"csi-node-driver-7hzkb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali07aa15a4522", MAC:"b2:fe:89:ed:07:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:45.388552 containerd[1627]: 2025-10-30 00:23:45.382 [INFO][4594] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" Namespace="calico-system" Pod="csi-node-driver-7hzkb" WorkloadEndpoint="localhost-k8s-csi--node--driver--7hzkb-eth0" Oct 30 00:23:45.398424 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3281647582.mount: Deactivated successfully. 
Oct 30 00:23:45.399368 containerd[1627]: time="2025-10-30T00:23:45.399341599Z" level=info msg="Container fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962: CDI devices from CRI Config.CDIDevices: []" Oct 30 00:23:45.412859 containerd[1627]: time="2025-10-30T00:23:45.412824621Z" level=info msg="CreateContainer within sandbox \"6b0d1d0596c53746653f24ee50c120c404c0876a5908e02d2ab3ff0d0893dfa0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962\"" Oct 30 00:23:45.415626 containerd[1627]: time="2025-10-30T00:23:45.413976865Z" level=info msg="StartContainer for \"fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962\"" Oct 30 00:23:45.415626 containerd[1627]: time="2025-10-30T00:23:45.414573973Z" level=info msg="connecting to shim fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962" address="unix:///run/containerd/s/4687cb040b094db9dc51780825b6f85283ca6758305e1f7e4ab37b7307922b93" protocol=ttrpc version=3 Oct 30 00:23:45.419183 containerd[1627]: time="2025-10-30T00:23:45.419156555Z" level=info msg="connecting to shim 160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415" address="unix:///run/containerd/s/cb70ab73bdb0cf5b582ee1e34fe674cb3703ef32157cdaa0f29d0cdd5b4f6c73" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:45.432314 systemd[1]: Started cri-containerd-fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962.scope - libcontainer container fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962. Oct 30 00:23:45.444279 systemd[1]: Started cri-containerd-160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415.scope - libcontainer container 160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415. 
Oct 30 00:23:45.458478 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:45.464429 containerd[1627]: time="2025-10-30T00:23:45.464404878Z" level=info msg="StartContainer for \"fcec6a15dff14e89aeb4f11d649d7ea6acf33900889e9df071045e73a50d2962\" returns successfully" Oct 30 00:23:45.474672 containerd[1627]: time="2025-10-30T00:23:45.474634463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7hzkb,Uid:54feb653-7cf9-4ac1-9a8d-a45c98b9a230,Namespace:calico-system,Attempt:0,} returns sandbox id \"160b2de363035dd1d932541a4540774a7bd89db2ef1a5b736cd9eb247d67d415\"" Oct 30 00:23:45.479003 containerd[1627]: time="2025-10-30T00:23:45.478927361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 30 00:23:45.759619 kubelet[2936]: E1030 00:23:45.759531 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:23:45.760688 kubelet[2936]: E1030 00:23:45.760489 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:23:45.784224 kubelet[2936]: I1030 00:23:45.784085 2936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-9dbqd" podStartSLOduration=38.784072643 podStartE2EDuration="38.784072643s" podCreationTimestamp="2025-10-30 00:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-30 00:23:45.775749295 +0000 UTC m=+44.747789103" watchObservedRunningTime="2025-10-30 00:23:45.784072643 +0000 UTC m=+44.756112439" Oct 30 00:23:45.801802 containerd[1627]: time="2025-10-30T00:23:45.801765411Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:45.802160 containerd[1627]: time="2025-10-30T00:23:45.802126462Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 30 00:23:45.802187 containerd[1627]: time="2025-10-30T00:23:45.802178025Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 30 00:23:45.802330 kubelet[2936]: E1030 00:23:45.802278 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:23:45.802330 kubelet[2936]: E1030 00:23:45.802319 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:23:45.803470 kubelet[2936]: E1030 00:23:45.803426 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:45.805193 containerd[1627]: time="2025-10-30T00:23:45.805174842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 30 00:23:46.129068 containerd[1627]: time="2025-10-30T00:23:46.128803313Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:46.140569 containerd[1627]: time="2025-10-30T00:23:46.140513932Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 30 00:23:46.140569 containerd[1627]: time="2025-10-30T00:23:46.140548496Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 30 00:23:46.141182 kubelet[2936]: E1030 00:23:46.140825 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:23:46.141182 kubelet[2936]: E1030 00:23:46.140857 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:23:46.141182 kubelet[2936]: E1030 00:23:46.140944 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:46.142162 kubelet[2936]: E1030 00:23:46.142136 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:46.166790 containerd[1627]: time="2025-10-30T00:23:46.166761077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-b4v2f,Uid:31c13ead-4475-476c-a22d-7ad5dc225044,Namespace:calico-apiserver,Attempt:0,}" Oct 30 00:23:46.292861 systemd-networkd[1507]: cali0760204f3ea: Gained IPv6LL Oct 30 00:23:46.356147 systemd-networkd[1507]: cali6c0bfebcc3f: Gained IPv6LL Oct 30 00:23:46.378229 systemd-networkd[1507]: calid1ba44e6b1d: Link UP Oct 30 00:23:46.378754 systemd-networkd[1507]: calid1ba44e6b1d: Gained carrier Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.306 [INFO][4776] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0 calico-apiserver-86d5586b69- calico-apiserver 31c13ead-4475-476c-a22d-7ad5dc225044 817 0 2025-10-30 00:23:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:86d5586b69 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-86d5586b69-b4v2f eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid1ba44e6b1d [] [] }} ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.306 [INFO][4776] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.345 [INFO][4788] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" HandleID="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Workload="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.345 [INFO][4788] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" 
HandleID="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Workload="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024ef80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-86d5586b69-b4v2f", "timestamp":"2025-10-30 00:23:46.345202519 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.345 [INFO][4788] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.345 [INFO][4788] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.345 [INFO][4788] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.350 [INFO][4788] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.353 [INFO][4788] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.355 [INFO][4788] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.357 [INFO][4788] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.358 [INFO][4788] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.358 [INFO][4788] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.359 [INFO][4788] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159 Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.369 [INFO][4788] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.374 [INFO][4788] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.374 [INFO][4788] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" host="localhost" Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.374 [INFO][4788] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:46.398445 containerd[1627]: 2025-10-30 00:23:46.374 [INFO][4788] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" HandleID="k8s-pod-network.9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Workload="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.376 [INFO][4776] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0", GenerateName:"calico-apiserver-86d5586b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"31c13ead-4475-476c-a22d-7ad5dc225044", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86d5586b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-86d5586b69-b4v2f", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1ba44e6b1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.376 [INFO][4776] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.376 [INFO][4776] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid1ba44e6b1d ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.379 [INFO][4776] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.379 [INFO][4776] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0", GenerateName:"calico-apiserver-86d5586b69-", Namespace:"calico-apiserver", SelfLink:"", UID:"31c13ead-4475-476c-a22d-7ad5dc225044", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"86d5586b69", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159", Pod:"calico-apiserver-86d5586b69-b4v2f", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid1ba44e6b1d", MAC:"1a:0a:9e:a1:21:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:46.405305 containerd[1627]: 2025-10-30 00:23:46.396 [INFO][4776] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" Namespace="calico-apiserver" Pod="calico-apiserver-86d5586b69-b4v2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--86d5586b69--b4v2f-eth0" Oct 30 00:23:46.456726 containerd[1627]: time="2025-10-30T00:23:46.456695886Z" level=info msg="connecting to shim 9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159" address="unix:///run/containerd/s/7387ddc78cfae9da097a7f9b27e36c2400416cdd4417d72850c363837b2f8697" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:46.480163 systemd[1]: Started cri-containerd-9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159.scope - libcontainer container 9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159. 
Oct 30 00:23:46.489073 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:46.520166 containerd[1627]: time="2025-10-30T00:23:46.520134416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-86d5586b69-b4v2f,Uid:31c13ead-4475-476c-a22d-7ad5dc225044,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9fe179f66827b32ed0e27b34e60c65e80bab7ae06ae87e2919c9c95918f63159\"" Oct 30 00:23:46.521176 containerd[1627]: time="2025-10-30T00:23:46.521148928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:23:46.762686 kubelet[2936]: E1030 00:23:46.762568 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:23:46.807118 systemd-networkd[1507]: cali07aa15a4522: Gained IPv6LL Oct 30 00:23:46.864061 containerd[1627]: time="2025-10-30T00:23:46.863997976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:46.864848 containerd[1627]: time="2025-10-30T00:23:46.864772917Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:23:46.864848 containerd[1627]: time="2025-10-30T00:23:46.864821599Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:23:46.864976 kubelet[2936]: E1030 00:23:46.864923 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:46.864976 kubelet[2936]: E1030 00:23:46.864959 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:46.865186 kubelet[2936]: E1030 00:23:46.865085 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffscj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-b4v2f_calico-apiserver(31c13ead-4475-476c-a22d-7ad5dc225044): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:46.866502 kubelet[2936]: E1030 00:23:46.866202 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:23:47.167463 containerd[1627]: time="2025-10-30T00:23:47.167406995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-8bq2q,Uid:c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4,Namespace:calico-system,Attempt:0,}" Oct 30 00:23:47.233945 systemd-networkd[1507]: calid955e540f1a: Link UP Oct 30 00:23:47.234531 systemd-networkd[1507]: calid955e540f1a: Gained carrier Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.192 [INFO][4852] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-goldmane--666569f655--8bq2q-eth0 goldmane-666569f655- calico-system c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4 820 0 2025-10-30 00:23:19 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-666569f655-8bq2q eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calid955e540f1a [] [] }} ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.192 [INFO][4852] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.210 [INFO][4864] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" HandleID="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Workload="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.210 [INFO][4864] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" HandleID="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Workload="localhost-k8s-goldmane--666569f655--8bq2q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f170), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-666569f655-8bq2q", "timestamp":"2025-10-30 00:23:47.210087472 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.210 [INFO][4864] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.210 [INFO][4864] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.210 [INFO][4864] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.215 [INFO][4864] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.219 [INFO][4864] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.222 [INFO][4864] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.223 [INFO][4864] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.224 [INFO][4864] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.224 [INFO][4864] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.225 [INFO][4864] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.227 [INFO][4864] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.230 [INFO][4864] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.230 [INFO][4864] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" host="localhost" Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.230 [INFO][4864] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Oct 30 00:23:47.246667 containerd[1627]: 2025-10-30 00:23:47.230 [INFO][4864] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" HandleID="k8s-pod-network.18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Workload="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.232 [INFO][4852] cni-plugin/k8s.go 418: Populated endpoint ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--8bq2q-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-666569f655-8bq2q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid955e540f1a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.232 [INFO][4852] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.232 [INFO][4852] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid955e540f1a ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.234 [INFO][4852] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.235 [INFO][4852] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--666569f655--8bq2q-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.October, 30, 0, 23, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd", Pod:"goldmane-666569f655-8bq2q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid955e540f1a", MAC:"92:18:8e:a5:cd:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 30 00:23:47.247244 containerd[1627]: 2025-10-30 00:23:47.241 [INFO][4852] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" Namespace="calico-system" Pod="goldmane-666569f655-8bq2q" WorkloadEndpoint="localhost-k8s-goldmane--666569f655--8bq2q-eth0" Oct 30 00:23:47.264002 containerd[1627]: time="2025-10-30T00:23:47.263956911Z" level=info msg="connecting to shim 18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd" address="unix:///run/containerd/s/ab95b29ccf14ebc1deaf18da09c4fabdfeb77aa3d7e95a61217318b9e75488c4" namespace=k8s.io protocol=ttrpc version=3 Oct 30 00:23:47.282384 systemd[1]: Started cri-containerd-18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd.scope - libcontainer container 18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd. 
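Every image pull in this section fails the same way: containerd reports "fetch failed after status: 404 Not Found" from ghcr.io, and the kubelet surfaces it as ErrImagePull and then ImagePullBackOff, because the referenced ghcr.io/flatcar/calico/*:v3.30.4 tags do not resolve. A minimal sketch that reproduces one of those pulls outside the kubelet, assuming the containerd 1.x Go client (the import path moved in 2.x), the default socket path, and the "k8s.io" namespace used by the CRI plugin:

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the same containerd instance the kubelet talks to.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatalf("connect: %v", err)
	}
	defer client.Close()

	// The CRI plugin keeps Kubernetes images in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// One of the references that fails in the log; a missing tag comes back
	// as a "not found" error, matching the NotFound RPC code the kubelet logs.
	ref := "ghcr.io/flatcar/calico/goldmane:v3.30.4"
	img, err := client.Pull(ctx, ref, containerd.WithPullUnpack)
	if err != nil {
		log.Fatalf("pull %s: %v", ref, err)
	}
	fmt.Println("pulled", img.Name())
}
```

Against a registry that actually serves the tag, the same call would unpack the image into the k8s.io namespace, and the kubelet's next sync would start the container instead of backing off.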
Oct 30 00:23:47.290435 systemd-resolved[1508]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Oct 30 00:23:47.316975 containerd[1627]: time="2025-10-30T00:23:47.316949768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-8bq2q,Uid:c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"18f13a1e2598763af6a1f91f4dc6045d11daf2a7a10a03c7382ba78c1ebf69cd\"" Oct 30 00:23:47.356700 containerd[1627]: time="2025-10-30T00:23:47.356674750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 30 00:23:47.672610 containerd[1627]: time="2025-10-30T00:23:47.672569418Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:47.677726 containerd[1627]: time="2025-10-30T00:23:47.677694285Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 30 00:23:47.677819 containerd[1627]: time="2025-10-30T00:23:47.677753415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 30 00:23:47.677884 kubelet[2936]: E1030 00:23:47.677856 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:23:47.677925 kubelet[2936]: E1030 00:23:47.677891 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:23:47.718620 kubelet[2936]: E1030 00:23:47.677981 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfsw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-8bq2q_calico-system(c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:47.719754 kubelet[2936]: E1030 00:23:47.719727 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:23:47.763730 kubelet[2936]: E1030 
00:23:47.763706 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:23:47.763999 kubelet[2936]: E1030 00:23:47.763761 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:23:47.956168 systemd-networkd[1507]: calid1ba44e6b1d: Gained IPv6LL Oct 30 00:23:48.765053 kubelet[2936]: E1030 00:23:48.764956 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:23:49.172200 systemd-networkd[1507]: calid955e540f1a: Gained IPv6LL Oct 30 00:23:56.167854 containerd[1627]: time="2025-10-30T00:23:56.167816528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 30 00:23:56.536638 containerd[1627]: time="2025-10-30T00:23:56.536546604Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:56.545515 containerd[1627]: time="2025-10-30T00:23:56.542901855Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 30 00:23:56.545515 containerd[1627]: time="2025-10-30T00:23:56.542947764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 30 00:23:56.545515 containerd[1627]: time="2025-10-30T00:23:56.545064049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 30 00:23:56.545634 kubelet[2936]: E1030 00:23:56.543069 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:23:56.545634 kubelet[2936]: E1030 00:23:56.543107 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:23:56.545634 kubelet[2936]: E1030 00:23:56.543185 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:00f8961b33664e97801f034c95c42fce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:56.890004 containerd[1627]: time="2025-10-30T00:23:56.889968843Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:56.894944 containerd[1627]: time="2025-10-30T00:23:56.894918479Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 30 00:23:56.895005 containerd[1627]: time="2025-10-30T00:23:56.894969798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 30 00:23:56.895107 kubelet[2936]: E1030 00:23:56.895081 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:23:56.895175 kubelet[2936]: E1030 00:23:56.895114 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:23:56.895286 kubelet[2936]: E1030 00:23:56.895216 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:56.896443 kubelet[2936]: E1030 00:23:56.896409 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:23:57.168134 containerd[1627]: 
time="2025-10-30T00:23:57.168054703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 30 00:23:57.517218 containerd[1627]: time="2025-10-30T00:23:57.517132981Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:57.517550 containerd[1627]: time="2025-10-30T00:23:57.517525520Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 30 00:23:57.517623 containerd[1627]: time="2025-10-30T00:23:57.517575490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 30 00:23:57.517699 kubelet[2936]: E1030 00:23:57.517669 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:23:57.517750 kubelet[2936]: E1030 00:23:57.517705 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:23:57.518140 kubelet[2936]: E1030 00:23:57.517831 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8m7qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7c8fdc95c4-n9pkj_calico-system(61d50a52-6f08-4848-bb64-ac0260f39fa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:57.519366 kubelet[2936]: E1030 00:23:57.519272 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:23:59.169364 containerd[1627]: time="2025-10-30T00:23:59.169133163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:23:59.508639 containerd[1627]: time="2025-10-30T00:23:59.508351032Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:23:59.514509 containerd[1627]: time="2025-10-30T00:23:59.514444983Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:23:59.514509 containerd[1627]: time="2025-10-30T00:23:59.514497097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:23:59.514629 kubelet[2936]: E1030 00:23:59.514590 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:59.515112 kubelet[2936]: E1030 00:23:59.514634 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:23:59.515112 kubelet[2936]: 
E1030 00:23:59.514735 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvs96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-4fl76_calico-apiserver(e46b88e1-56de-4271-a039-c9f0466406b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:23:59.516442 kubelet[2936]: E1030 00:23:59.516399 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:24:00.167686 containerd[1627]: time="2025-10-30T00:24:00.167655297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:24:00.618755 containerd[1627]: time="2025-10-30T00:24:00.618710168Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:00.619097 containerd[1627]: time="2025-10-30T00:24:00.619008606Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:24:00.619097 containerd[1627]: time="2025-10-30T00:24:00.619046925Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:24:00.619780 kubelet[2936]: E1030 00:24:00.619215 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:00.619780 kubelet[2936]: E1030 00:24:00.619253 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:00.619780 kubelet[2936]: E1030 00:24:00.619344 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffscj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-b4v2f_calico-apiserver(31c13ead-4475-476c-a22d-7ad5dc225044): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:00.621573 kubelet[2936]: E1030 00:24:00.621343 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:24:01.168220 containerd[1627]: time="2025-10-30T00:24:01.168195667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 30 00:24:01.508058 containerd[1627]: time="2025-10-30T00:24:01.507876568Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:01.516441 containerd[1627]: time="2025-10-30T00:24:01.516391814Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 30 00:24:01.516503 containerd[1627]: time="2025-10-30T00:24:01.516462816Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 30 00:24:01.516589 kubelet[2936]: E1030 00:24:01.516559 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:24:01.516629 kubelet[2936]: E1030 00:24:01.516596 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:24:01.516779 kubelet[2936]: E1030 00:24:01.516720 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:01.517247 containerd[1627]: time="2025-10-30T00:24:01.517217586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 30 00:24:01.894684 containerd[1627]: time="2025-10-30T00:24:01.894525890Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:01.895129 containerd[1627]: time="2025-10-30T00:24:01.895082071Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 30 00:24:01.895169 containerd[1627]: time="2025-10-30T00:24:01.895149150Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 30 00:24:01.895313 kubelet[2936]: E1030 00:24:01.895280 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:24:01.895459 kubelet[2936]: E1030 00:24:01.895321 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:24:01.896055 kubelet[2936]: E1030 00:24:01.895472 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfsw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-8bq2q_calico-system(c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:01.896149 containerd[1627]: time="2025-10-30T00:24:01.895579314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 30 00:24:01.897044 kubelet[2936]: E1030 00:24:01.896957 2936 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:24:02.591532 containerd[1627]: time="2025-10-30T00:24:02.591383724Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:02.593658 containerd[1627]: time="2025-10-30T00:24:02.593638640Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 30 00:24:02.593758 containerd[1627]: time="2025-10-30T00:24:02.593651927Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 30 00:24:02.593881 kubelet[2936]: E1030 00:24:02.593819 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:24:02.593917 kubelet[2936]: E1030 00:24:02.593889 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:24:02.593991 kubelet[2936]: E1030 00:24:02.593964 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:02.595764 kubelet[2936]: E1030 00:24:02.595732 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:24:08.168803 kubelet[2936]: E1030 00:24:08.168747 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for 
\"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:24:10.167563 kubelet[2936]: E1030 00:24:10.167534 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:24:12.525957 containerd[1627]: time="2025-10-30T00:24:12.525905495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\" id:\"2f8b64f00c2b7bdd48e185dcf3b3d865f560d0f1c5c5054f2ce891b87ae9cda5\" pid:4976 exited_at:{seconds:1761783852 nanos:525674470}" Oct 30 00:24:13.168078 kubelet[2936]: E1030 00:24:13.167978 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:24:14.167276 kubelet[2936]: E1030 00:24:14.167185 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:24:15.168496 kubelet[2936]: E1030 00:24:15.168429 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:24:16.168244 kubelet[2936]: E1030 00:24:16.168106 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:24:19.168920 containerd[1627]: time="2025-10-30T00:24:19.168877039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Oct 30 00:24:19.539472 containerd[1627]: time="2025-10-30T00:24:19.539105301Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:19.545984 containerd[1627]: time="2025-10-30T00:24:19.545961650Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Oct 30 00:24:19.546116 containerd[1627]: time="2025-10-30T00:24:19.546065411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Oct 30 00:24:19.546577 kubelet[2936]: E1030 00:24:19.546243 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:24:19.546577 kubelet[2936]: E1030 00:24:19.546280 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Oct 30 00:24:19.546577 kubelet[2936]: E1030 00:24:19.546363 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:00f8961b33664e97801f034c95c42fce,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:19.557438 containerd[1627]: time="2025-10-30T00:24:19.548653303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Oct 30 00:24:19.883466 containerd[1627]: time="2025-10-30T00:24:19.883293293Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:19.888005 containerd[1627]: time="2025-10-30T00:24:19.887909634Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Oct 30 00:24:19.888005 containerd[1627]: time="2025-10-30T00:24:19.887982575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Oct 30 00:24:19.892720 kubelet[2936]: E1030 00:24:19.888272 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:24:19.892720 kubelet[2936]: E1030 00:24:19.888304 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Oct 30 00:24:19.892720 kubelet[2936]: E1030 00:24:19.888398 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhdtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c95cf544d-r474l_calico-system(ad72706a-e193-45fa-8c06-007a36e881f3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:19.892720 kubelet[2936]: E1030 00:24:19.889726 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:24:20.361491 systemd[1]: Started sshd@7-139.178.70.100:22-139.178.89.65:58562.service - OpenSSH per-connection server daemon (139.178.89.65:58562). 
Oct 30 00:24:20.625969 sshd[4994]: Accepted publickey for core from 139.178.89.65 port 58562 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:20.627906 sshd-session[4994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:20.632453 systemd-logind[1606]: New session 10 of user core. Oct 30 00:24:20.641353 systemd[1]: Started session-10.scope - Session 10 of User core. Oct 30 00:24:21.755329 sshd[4997]: Connection closed by 139.178.89.65 port 58562 Oct 30 00:24:21.755248 sshd-session[4994]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:21.760482 systemd[1]: sshd@7-139.178.70.100:22-139.178.89.65:58562.service: Deactivated successfully. Oct 30 00:24:21.761944 systemd[1]: session-10.scope: Deactivated successfully. Oct 30 00:24:21.765162 systemd-logind[1606]: Session 10 logged out. Waiting for processes to exit. Oct 30 00:24:21.765921 systemd-logind[1606]: Removed session 10. Oct 30 00:24:21.801088 systemd[1]: Started sshd@8-139.178.70.100:22-205.210.31.206:49503.service - OpenSSH per-connection server daemon (205.210.31.206:49503). Oct 30 00:24:22.102290 sshd[5011]: Connection closed by 205.210.31.206 port 49503 Oct 30 00:24:22.102804 systemd[1]: sshd@8-139.178.70.100:22-205.210.31.206:49503.service: Deactivated successfully. Oct 30 00:24:23.168671 containerd[1627]: time="2025-10-30T00:24:23.168408902Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Oct 30 00:24:23.529491 containerd[1627]: time="2025-10-30T00:24:23.529415589Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:23.536524 containerd[1627]: time="2025-10-30T00:24:23.536486894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Oct 30 00:24:23.536600 containerd[1627]: time="2025-10-30T00:24:23.536549733Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Oct 30 00:24:23.536663 kubelet[2936]: E1030 00:24:23.536634 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:24:23.543500 kubelet[2936]: E1030 00:24:23.536670 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Oct 30 00:24:23.543500 kubelet[2936]: E1030 00:24:23.536754 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8m7qj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7c8fdc95c4-n9pkj_calico-system(61d50a52-6f08-4848-bb64-ac0260f39fa4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:23.543500 kubelet[2936]: E1030 00:24:23.538080 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:24:25.168979 containerd[1627]: time="2025-10-30T00:24:25.168490528Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:24:25.532331 containerd[1627]: time="2025-10-30T00:24:25.532118892Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:25.532534 containerd[1627]: time="2025-10-30T00:24:25.532516238Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:24:25.533079 containerd[1627]: time="2025-10-30T00:24:25.532578443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:24:25.533668 kubelet[2936]: E1030 00:24:25.532756 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:25.533668 kubelet[2936]: E1030 00:24:25.532788 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:25.534162 kubelet[2936]: E1030 00:24:25.534118 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvs96,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-4fl76_calico-apiserver(e46b88e1-56de-4271-a039-c9f0466406b0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:25.535294 kubelet[2936]: E1030 00:24:25.535266 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:24:26.168680 containerd[1627]: time="2025-10-30T00:24:26.168642251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Oct 30 00:24:26.524634 containerd[1627]: time="2025-10-30T00:24:26.524420323Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:26.525272 containerd[1627]: time="2025-10-30T00:24:26.525249238Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Oct 30 00:24:26.525399 containerd[1627]: time="2025-10-30T00:24:26.525316677Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Oct 30 00:24:26.525751 kubelet[2936]: E1030 00:24:26.525439 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:24:26.525751 kubelet[2936]: E1030 00:24:26.525470 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Oct 30 00:24:26.525751 kubelet[2936]: E1030 00:24:26.525549 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:26.527855 containerd[1627]: time="2025-10-30T00:24:26.527654051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Oct 30 00:24:26.764772 systemd[1]: Started sshd@9-139.178.70.100:22-139.178.89.65:35262.service - OpenSSH per-connection server daemon (139.178.89.65:35262). Oct 30 00:24:26.824735 sshd[5022]: Accepted publickey for core from 139.178.89.65 port 35262 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:26.825913 sshd-session[5022]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:26.829432 systemd-logind[1606]: New session 11 of user core. Oct 30 00:24:26.833132 systemd[1]: Started session-11.scope - Session 11 of User core. 
Oct 30 00:24:26.892682 containerd[1627]: time="2025-10-30T00:24:26.892651598Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:26.893065 containerd[1627]: time="2025-10-30T00:24:26.893026284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Oct 30 00:24:26.893161 containerd[1627]: time="2025-10-30T00:24:26.893150463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Oct 30 00:24:26.893429 kubelet[2936]: E1030 00:24:26.893398 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:24:26.893606 kubelet[2936]: E1030 00:24:26.893434 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Oct 30 00:24:26.893606 kubelet[2936]: E1030 00:24:26.893512 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8rnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-7hzkb_calico-system(54feb653-7cf9-4ac1-9a8d-a45c98b9a230): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:26.894915 kubelet[2936]: E1030 00:24:26.894892 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:24:26.959257 sshd[5025]: Connection closed by 139.178.89.65 port 35262 Oct 30 00:24:26.960156 sshd-session[5022]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:26.962628 systemd-logind[1606]: Session 11 logged out. Waiting for processes to exit. Oct 30 00:24:26.962806 systemd[1]: sshd@9-139.178.70.100:22-139.178.89.65:35262.service: Deactivated successfully. Oct 30 00:24:26.964262 systemd[1]: session-11.scope: Deactivated successfully. Oct 30 00:24:26.965265 systemd-logind[1606]: Removed session 11. 
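[Annotation, not part of the captured journal] When triaging a dump like this, it can help to tally which image references are stuck in ErrImagePull or ImagePullBackOff rather than reading every repeated entry. The helper below is only a sketch: it does plain substring and regex matching on the message shapes visible in this log, and the input file name "journal.txt" is an assumption.

```python
# Hypothetical triage helper: count failing image references in a saved journal dump.
import re
from collections import Counter

failing = Counter()
image_field = re.compile(r'image="([^"]+)"')  # kubelet's image="..." field as seen above

with open("journal.txt", encoding="utf-8", errors="replace") as fh:
    for line in fh:
        if "ErrImagePull" in line or "ImagePullBackOff" in line:
            match = image_field.search(line)
            if match:
                failing[match.group(1)] += 1

for image, count in failing.most_common():
    print(f"{count:4d}  {image}")
```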
Oct 30 00:24:27.169322 containerd[1627]: time="2025-10-30T00:24:27.169241092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Oct 30 00:24:27.495419 containerd[1627]: time="2025-10-30T00:24:27.495334862Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:27.495706 containerd[1627]: time="2025-10-30T00:24:27.495686338Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Oct 30 00:24:27.495759 containerd[1627]: time="2025-10-30T00:24:27.495742327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Oct 30 00:24:27.495888 kubelet[2936]: E1030 00:24:27.495842 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:27.495936 kubelet[2936]: E1030 00:24:27.495894 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Oct 30 00:24:27.496654 kubelet[2936]: E1030 00:24:27.496195 2936 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffscj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-86d5586b69-b4v2f_calico-apiserver(31c13ead-4475-476c-a22d-7ad5dc225044): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:27.496752 containerd[1627]: time="2025-10-30T00:24:27.496264753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Oct 30 00:24:27.498101 kubelet[2936]: E1030 00:24:27.498074 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:24:27.808971 containerd[1627]: time="2025-10-30T00:24:27.808945200Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Oct 30 00:24:27.809270 containerd[1627]: time="2025-10-30T00:24:27.809252873Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Oct 30 00:24:27.809330 containerd[1627]: time="2025-10-30T00:24:27.809301833Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Oct 30 00:24:27.809451 kubelet[2936]: E1030 00:24:27.809405 2936 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:24:27.809510 kubelet[2936]: E1030 00:24:27.809458 2936 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Oct 30 00:24:27.809774 kubelet[2936]: E1030 00:24:27.809559 2936 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfsw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-8bq2q_calico-system(c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Oct 30 00:24:27.810703 kubelet[2936]: E1030 00:24:27.810678 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" 
podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:24:31.974499 systemd[1]: Started sshd@10-139.178.70.100:22-139.178.89.65:35270.service - OpenSSH per-connection server daemon (139.178.89.65:35270). Oct 30 00:24:32.035295 sshd[5040]: Accepted publickey for core from 139.178.89.65 port 35270 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:32.036623 sshd-session[5040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:32.040371 systemd-logind[1606]: New session 12 of user core. Oct 30 00:24:32.050221 systemd[1]: Started session-12.scope - Session 12 of User core. Oct 30 00:24:32.167772 kubelet[2936]: E1030 00:24:32.167745 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:24:32.187576 sshd[5043]: Connection closed by 139.178.89.65 port 35270 Oct 30 00:24:32.188231 sshd-session[5040]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:32.195373 systemd[1]: sshd@10-139.178.70.100:22-139.178.89.65:35270.service: Deactivated successfully. Oct 30 00:24:32.197092 systemd[1]: session-12.scope: Deactivated successfully. Oct 30 00:24:32.197794 systemd-logind[1606]: Session 12 logged out. Waiting for processes to exit. Oct 30 00:24:32.199578 systemd-logind[1606]: Removed session 12. Oct 30 00:24:32.204317 systemd[1]: Started sshd@11-139.178.70.100:22-139.178.89.65:35276.service - OpenSSH per-connection server daemon (139.178.89.65:35276). Oct 30 00:24:32.266787 sshd[5056]: Accepted publickey for core from 139.178.89.65 port 35276 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:32.268823 sshd-session[5056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:32.273204 systemd-logind[1606]: New session 13 of user core. Oct 30 00:24:32.278247 systemd[1]: Started session-13.scope - Session 13 of User core. Oct 30 00:24:32.453877 sshd[5061]: Connection closed by 139.178.89.65 port 35276 Oct 30 00:24:32.453808 sshd-session[5056]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:32.463567 systemd[1]: sshd@11-139.178.70.100:22-139.178.89.65:35276.service: Deactivated successfully. Oct 30 00:24:32.466733 systemd[1]: session-13.scope: Deactivated successfully. Oct 30 00:24:32.469065 systemd-logind[1606]: Session 13 logged out. Waiting for processes to exit. Oct 30 00:24:32.475385 systemd[1]: Started sshd@12-139.178.70.100:22-139.178.89.65:35290.service - OpenSSH per-connection server daemon (139.178.89.65:35290). Oct 30 00:24:32.477366 systemd-logind[1606]: Removed session 13. 
Oct 30 00:24:32.555508 sshd[5071]: Accepted publickey for core from 139.178.89.65 port 35290 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:32.558483 sshd-session[5071]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:32.565099 systemd-logind[1606]: New session 14 of user core. Oct 30 00:24:32.570389 systemd[1]: Started session-14.scope - Session 14 of User core. Oct 30 00:24:32.696558 sshd[5074]: Connection closed by 139.178.89.65 port 35290 Oct 30 00:24:32.697016 sshd-session[5071]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:32.700001 systemd[1]: sshd@12-139.178.70.100:22-139.178.89.65:35290.service: Deactivated successfully. Oct 30 00:24:32.702553 systemd[1]: session-14.scope: Deactivated successfully. Oct 30 00:24:32.703564 systemd-logind[1606]: Session 14 logged out. Waiting for processes to exit. Oct 30 00:24:32.705418 systemd-logind[1606]: Removed session 14. Oct 30 00:24:36.167451 kubelet[2936]: E1030 00:24:36.167411 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:24:37.167510 kubelet[2936]: E1030 00:24:37.167320 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:24:37.713787 systemd[1]: Started sshd@13-139.178.70.100:22-139.178.89.65:47518.service - OpenSSH per-connection server daemon (139.178.89.65:47518). Oct 30 00:24:37.791076 sshd[5091]: Accepted publickey for core from 139.178.89.65 port 47518 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:37.792012 sshd-session[5091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:37.799100 systemd-logind[1606]: New session 15 of user core. Oct 30 00:24:37.802235 systemd[1]: Started session-15.scope - Session 15 of User core. Oct 30 00:24:37.939142 sshd[5094]: Connection closed by 139.178.89.65 port 47518 Oct 30 00:24:37.939433 sshd-session[5091]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:37.942483 systemd-logind[1606]: Session 15 logged out. Waiting for processes to exit. Oct 30 00:24:37.942709 systemd[1]: sshd@13-139.178.70.100:22-139.178.89.65:47518.service: Deactivated successfully. Oct 30 00:24:37.944164 systemd[1]: session-15.scope: Deactivated successfully. Oct 30 00:24:37.945235 systemd-logind[1606]: Removed session 15. 
Oct 30 00:24:40.167371 kubelet[2936]: E1030 00:24:40.167331 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:24:41.169054 kubelet[2936]: E1030 00:24:41.168783 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:24:42.167455 kubelet[2936]: E1030 00:24:42.167338 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:24:42.676811 containerd[1627]: time="2025-10-30T00:24:42.676779320Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56ed2cb79129ea933450bc19e3175d4ed5dbbd3db0391927ff31928a2e9a290a\" id:\"5b228c6d29c0458283e02b7ed20c258306202af127592830839cb03fd48dfb34\" pid:5121 exited_at:{seconds:1761783882 nanos:676556818}" Oct 30 00:24:42.953338 systemd[1]: Started sshd@14-139.178.70.100:22-139.178.89.65:47522.service - OpenSSH per-connection server daemon (139.178.89.65:47522). Oct 30 00:24:43.012803 sshd[5133]: Accepted publickey for core from 139.178.89.65 port 47522 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:43.013792 sshd-session[5133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:43.016683 systemd-logind[1606]: New session 16 of user core. Oct 30 00:24:43.025374 systemd[1]: Started session-16.scope - Session 16 of User core. Oct 30 00:24:43.125346 sshd[5138]: Connection closed by 139.178.89.65 port 47522 Oct 30 00:24:43.124944 sshd-session[5133]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:43.127259 systemd[1]: sshd@14-139.178.70.100:22-139.178.89.65:47522.service: Deactivated successfully. 
Oct 30 00:24:43.128870 systemd[1]: session-16.scope: Deactivated successfully. Oct 30 00:24:43.131019 systemd-logind[1606]: Session 16 logged out. Waiting for processes to exit. Oct 30 00:24:43.132284 systemd-logind[1606]: Removed session 16. Oct 30 00:24:43.168881 kubelet[2936]: E1030 00:24:43.168847 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:24:48.137265 systemd[1]: Started sshd@15-139.178.70.100:22-139.178.89.65:42298.service - OpenSSH per-connection server daemon (139.178.89.65:42298). Oct 30 00:24:48.169051 kubelet[2936]: E1030 00:24:48.168561 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:24:48.239393 sshd[5151]: Accepted publickey for core from 139.178.89.65 port 42298 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:48.240655 sshd-session[5151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:48.245091 systemd-logind[1606]: New session 17 of user core. Oct 30 00:24:48.252640 systemd[1]: Started session-17.scope - Session 17 of User core. Oct 30 00:24:48.410294 sshd[5154]: Connection closed by 139.178.89.65 port 42298 Oct 30 00:24:48.410153 sshd-session[5151]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:48.418742 systemd[1]: sshd@15-139.178.70.100:22-139.178.89.65:42298.service: Deactivated successfully. Oct 30 00:24:48.419942 systemd[1]: session-17.scope: Deactivated successfully. Oct 30 00:24:48.421730 systemd-logind[1606]: Session 17 logged out. Waiting for processes to exit. Oct 30 00:24:48.423443 systemd-logind[1606]: Removed session 17. Oct 30 00:24:48.425355 systemd[1]: Started sshd@16-139.178.70.100:22-139.178.89.65:42304.service - OpenSSH per-connection server daemon (139.178.89.65:42304). 
Oct 30 00:24:48.491315 sshd[5166]: Accepted publickey for core from 139.178.89.65 port 42304 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:48.492366 sshd-session[5166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:48.496783 systemd-logind[1606]: New session 18 of user core. Oct 30 00:24:48.501110 systemd[1]: Started session-18.scope - Session 18 of User core. Oct 30 00:24:48.944724 sshd[5169]: Connection closed by 139.178.89.65 port 42304 Oct 30 00:24:48.942888 sshd-session[5166]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:48.950685 systemd[1]: Started sshd@17-139.178.70.100:22-139.178.89.65:42314.service - OpenSSH per-connection server daemon (139.178.89.65:42314). Oct 30 00:24:48.950983 systemd[1]: sshd@16-139.178.70.100:22-139.178.89.65:42304.service: Deactivated successfully. Oct 30 00:24:48.954532 systemd[1]: session-18.scope: Deactivated successfully. Oct 30 00:24:48.956484 systemd-logind[1606]: Session 18 logged out. Waiting for processes to exit. Oct 30 00:24:48.958376 systemd-logind[1606]: Removed session 18. Oct 30 00:24:49.031182 sshd[5176]: Accepted publickey for core from 139.178.89.65 port 42314 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:49.033196 sshd-session[5176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:49.037788 systemd-logind[1606]: New session 19 of user core. Oct 30 00:24:49.044236 systemd[1]: Started session-19.scope - Session 19 of User core. Oct 30 00:24:49.621382 sshd[5182]: Connection closed by 139.178.89.65 port 42314 Oct 30 00:24:49.622236 sshd-session[5176]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:49.629580 systemd[1]: sshd@17-139.178.70.100:22-139.178.89.65:42314.service: Deactivated successfully. Oct 30 00:24:49.631763 systemd[1]: session-19.scope: Deactivated successfully. Oct 30 00:24:49.632673 systemd-logind[1606]: Session 19 logged out. Waiting for processes to exit. Oct 30 00:24:49.635327 systemd-logind[1606]: Removed session 19. Oct 30 00:24:49.637558 systemd[1]: Started sshd@18-139.178.70.100:22-139.178.89.65:42320.service - OpenSSH per-connection server daemon (139.178.89.65:42320). Oct 30 00:24:49.703385 sshd[5197]: Accepted publickey for core from 139.178.89.65 port 42320 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:49.705430 sshd-session[5197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:49.711062 systemd-logind[1606]: New session 20 of user core. Oct 30 00:24:49.715179 systemd[1]: Started session-20.scope - Session 20 of User core. Oct 30 00:24:49.964498 sshd[5202]: Connection closed by 139.178.89.65 port 42320 Oct 30 00:24:49.963545 sshd-session[5197]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:49.970244 systemd[1]: sshd@18-139.178.70.100:22-139.178.89.65:42320.service: Deactivated successfully. Oct 30 00:24:49.973097 systemd[1]: session-20.scope: Deactivated successfully. Oct 30 00:24:49.974478 systemd-logind[1606]: Session 20 logged out. Waiting for processes to exit. Oct 30 00:24:49.980287 systemd[1]: Started sshd@19-139.178.70.100:22-139.178.89.65:42328.service - OpenSSH per-connection server daemon (139.178.89.65:42328). Oct 30 00:24:49.981434 systemd-logind[1606]: Removed session 20. 
Oct 30 00:24:50.031299 sshd[5212]: Accepted publickey for core from 139.178.89.65 port 42328 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:50.032604 sshd-session[5212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:50.038066 systemd-logind[1606]: New session 21 of user core. Oct 30 00:24:50.041176 systemd[1]: Started session-21.scope - Session 21 of User core. Oct 30 00:24:50.167559 sshd[5215]: Connection closed by 139.178.89.65 port 42328 Oct 30 00:24:50.168326 sshd-session[5212]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:50.171732 systemd[1]: sshd@19-139.178.70.100:22-139.178.89.65:42328.service: Deactivated successfully. Oct 30 00:24:50.173422 systemd[1]: session-21.scope: Deactivated successfully. Oct 30 00:24:50.175076 systemd-logind[1606]: Session 21 logged out. Waiting for processes to exit. Oct 30 00:24:50.176445 systemd-logind[1606]: Removed session 21. Oct 30 00:24:52.167748 kubelet[2936]: E1030 00:24:52.167630 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:24:53.170762 kubelet[2936]: E1030 00:24:53.170729 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:24:54.168071 kubelet[2936]: E1030 00:24:54.167816 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-b4v2f" podUID="31c13ead-4475-476c-a22d-7ad5dc225044" Oct 30 00:24:55.171994 kubelet[2936]: E1030 00:24:55.171912 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-8bq2q" podUID="c7da7a4a-cbcd-4d6d-82c2-808e14afe4d4" Oct 30 00:24:55.181706 systemd[1]: Started sshd@20-139.178.70.100:22-139.178.89.65:42342.service - OpenSSH per-connection server daemon (139.178.89.65:42342). Oct 30 00:24:55.235162 sshd[5229]: Accepted publickey for core from 139.178.89.65 port 42342 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:24:55.236019 sshd-session[5229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:24:55.238697 systemd-logind[1606]: New session 22 of user core. Oct 30 00:24:55.245129 systemd[1]: Started session-22.scope - Session 22 of User core. Oct 30 00:24:55.375797 sshd[5232]: Connection closed by 139.178.89.65 port 42342 Oct 30 00:24:55.375212 sshd-session[5229]: pam_unix(sshd:session): session closed for user core Oct 30 00:24:55.378914 systemd-logind[1606]: Session 22 logged out. Waiting for processes to exit. Oct 30 00:24:55.379061 systemd[1]: sshd@20-139.178.70.100:22-139.178.89.65:42342.service: Deactivated successfully. Oct 30 00:24:55.381616 systemd[1]: session-22.scope: Deactivated successfully. Oct 30 00:24:55.383501 systemd-logind[1606]: Removed session 22. Oct 30 00:24:57.170517 kubelet[2936]: E1030 00:24:57.170413 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-c95cf544d-r474l" podUID="ad72706a-e193-45fa-8c06-007a36e881f3" Oct 30 00:25:00.168053 kubelet[2936]: E1030 00:25:00.167612 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7c8fdc95c4-n9pkj" podUID="61d50a52-6f08-4848-bb64-ac0260f39fa4" Oct 30 00:25:00.383867 systemd[1]: Started sshd@21-139.178.70.100:22-139.178.89.65:54820.service - OpenSSH per-connection server daemon (139.178.89.65:54820). 
Oct 30 00:25:00.431064 sshd[5244]: Accepted publickey for core from 139.178.89.65 port 54820 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:25:00.430682 sshd-session[5244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:25:00.436099 systemd-logind[1606]: New session 23 of user core. Oct 30 00:25:00.442391 systemd[1]: Started session-23.scope - Session 23 of User core. Oct 30 00:25:00.563257 sshd[5247]: Connection closed by 139.178.89.65 port 54820 Oct 30 00:25:00.563794 sshd-session[5244]: pam_unix(sshd:session): session closed for user core Oct 30 00:25:00.567293 systemd[1]: sshd@21-139.178.70.100:22-139.178.89.65:54820.service: Deactivated successfully. Oct 30 00:25:00.569716 systemd[1]: session-23.scope: Deactivated successfully. Oct 30 00:25:00.573181 systemd-logind[1606]: Session 23 logged out. Waiting for processes to exit. Oct 30 00:25:00.574209 systemd-logind[1606]: Removed session 23. Oct 30 00:25:05.168066 kubelet[2936]: E1030 00:25:05.167815 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-86d5586b69-4fl76" podUID="e46b88e1-56de-4271-a039-c9f0466406b0" Oct 30 00:25:05.168485 kubelet[2936]: E1030 00:25:05.168392 2936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-7hzkb" podUID="54feb653-7cf9-4ac1-9a8d-a45c98b9a230" Oct 30 00:25:05.578494 systemd[1]: Started sshd@22-139.178.70.100:22-139.178.89.65:54830.service - OpenSSH per-connection server daemon (139.178.89.65:54830). Oct 30 00:25:05.629011 sshd[5269]: Accepted publickey for core from 139.178.89.65 port 54830 ssh2: RSA SHA256:B7vdsv+EQ7dta6Pu+3JtAvdwJv1+G1tAggn7MZR/5Ts Oct 30 00:25:05.630109 sshd-session[5269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 30 00:25:05.634663 systemd-logind[1606]: New session 24 of user core. Oct 30 00:25:05.637114 systemd[1]: Started session-24.scope - Session 24 of User core. 
Oct 30 00:25:05.755046 sshd[5272]: Connection closed by 139.178.89.65 port 54830 Oct 30 00:25:05.757170 sshd-session[5269]: pam_unix(sshd:session): session closed for user core Oct 30 00:25:05.759134 systemd[1]: sshd@22-139.178.70.100:22-139.178.89.65:54830.service: Deactivated successfully. Oct 30 00:25:05.760323 systemd[1]: session-24.scope: Deactivated successfully. Oct 30 00:25:05.762749 systemd-logind[1606]: Session 24 logged out. Waiting for processes to exit. Oct 30 00:25:05.763368 systemd-logind[1606]: Removed session 24.