May 27 17:40:09.710545 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 15:32:02 -00 2025 May 27 17:40:09.710561 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 17:40:09.710568 kernel: Disabled fast string operations May 27 17:40:09.710572 kernel: BIOS-provided physical RAM map: May 27 17:40:09.710576 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable May 27 17:40:09.710580 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved May 27 17:40:09.710586 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved May 27 17:40:09.710590 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable May 27 17:40:09.710595 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data May 27 17:40:09.710599 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS May 27 17:40:09.710603 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable May 27 17:40:09.710608 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved May 27 17:40:09.710612 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved May 27 17:40:09.710616 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved May 27 17:40:09.710622 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved May 27 17:40:09.710627 kernel: NX (Execute Disable) protection: active May 27 17:40:09.710632 kernel: APIC: Static calls initialized May 27 17:40:09.710637 kernel: 
SMBIOS 2.7 present. May 27 17:40:09.710642 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 May 27 17:40:09.710647 kernel: DMI: Memory slots populated: 1/128 May 27 17:40:09.710652 kernel: vmware: hypercall mode: 0x00 May 27 17:40:09.710657 kernel: Hypervisor detected: VMware May 27 17:40:09.710662 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz May 27 17:40:09.710667 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz May 27 17:40:09.710672 kernel: vmware: using clock offset of 3415544251 ns May 27 17:40:09.710677 kernel: tsc: Detected 3408.000 MHz processor May 27 17:40:09.710682 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 27 17:40:09.710687 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 27 17:40:09.710692 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 May 27 17:40:09.710697 kernel: total RAM covered: 3072M May 27 17:40:09.710703 kernel: Found optimal setting for mtrr clean up May 27 17:40:09.710710 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G May 27 17:40:09.710715 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs May 27 17:40:09.710720 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 27 17:40:09.710725 kernel: Using GB pages for direct mapping May 27 17:40:09.710730 kernel: ACPI: Early table checksum verification disabled May 27 17:40:09.710734 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) May 27 17:40:09.710739 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) May 27 17:40:09.710744 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) May 27 17:40:09.710750 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) May 27 17:40:09.710757 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 May 27 17:40:09.710762 kernel: ACPI: FACS 
0x000000007FEFFFC0 000040 May 27 17:40:09.710767 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) May 27 17:40:09.710772 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000) May 27 17:40:09.710779 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) May 27 17:40:09.710784 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) May 27 17:40:09.710789 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) May 27 17:40:09.710794 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) May 27 17:40:09.710799 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] May 27 17:40:09.710804 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] May 27 17:40:09.710809 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] May 27 17:40:09.710814 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] May 27 17:40:09.710819 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] May 27 17:40:09.710825 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] May 27 17:40:09.710831 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] May 27 17:40:09.710836 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] May 27 17:40:09.710841 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] May 27 17:40:09.710846 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] May 27 17:40:09.710851 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] May 27 17:40:09.710856 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] May 27 17:40:09.710861 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug May 27 17:40:09.710866 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 
0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff] May 27 17:40:09.710872 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff] May 27 17:40:09.710878 kernel: Zone ranges: May 27 17:40:09.710883 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 27 17:40:09.710888 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] May 27 17:40:09.710894 kernel: Normal empty May 27 17:40:09.710899 kernel: Device empty May 27 17:40:09.710904 kernel: Movable zone start for each node May 27 17:40:09.710909 kernel: Early memory node ranges May 27 17:40:09.710914 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] May 27 17:40:09.710919 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] May 27 17:40:09.710924 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] May 27 17:40:09.710930 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] May 27 17:40:09.710935 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 27 17:40:09.710940 kernel: On node 0, zone DMA: 98 pages in unavailable ranges May 27 17:40:09.710945 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges May 27 17:40:09.710950 kernel: ACPI: PM-Timer IO Port: 0x1008 May 27 17:40:09.710955 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) May 27 17:40:09.710960 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) May 27 17:40:09.710965 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) May 27 17:40:09.710970 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) May 27 17:40:09.710976 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) May 27 17:40:09.710982 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) May 27 17:40:09.710987 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) May 27 17:40:09.710992 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) May 27 17:40:09.710997 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) May 27 17:40:09.711002 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x09] high edge lint[0x1]) May 27 17:40:09.711007 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) May 27 17:40:09.711012 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) May 27 17:40:09.711017 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) May 27 17:40:09.711022 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) May 27 17:40:09.711028 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) May 27 17:40:09.711033 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) May 27 17:40:09.711038 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) May 27 17:40:09.711043 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) May 27 17:40:09.711048 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) May 27 17:40:09.711053 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) May 27 17:40:09.711058 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) May 27 17:40:09.711063 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) May 27 17:40:09.711068 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) May 27 17:40:09.711074 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) May 27 17:40:09.711079 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) May 27 17:40:09.711084 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) May 27 17:40:09.711089 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) May 27 17:40:09.711094 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) May 27 17:40:09.711099 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) May 27 17:40:09.711104 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) May 27 17:40:09.711109 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) May 27 17:40:09.711114 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) May 27 17:40:09.711119 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) May 27 17:40:09.711125 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x21] high edge lint[0x1]) May 27 17:40:09.711130 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) May 27 17:40:09.711135 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) May 27 17:40:09.711140 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) May 27 17:40:09.711145 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) May 27 17:40:09.711150 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) May 27 17:40:09.711155 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) May 27 17:40:09.711165 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) May 27 17:40:09.711170 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) May 27 17:40:09.711175 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) May 27 17:40:09.711181 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) May 27 17:40:09.711187 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) May 27 17:40:09.711192 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) May 27 17:40:09.711197 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) May 27 17:40:09.711203 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) May 27 17:40:09.711208 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) May 27 17:40:09.711213 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) May 27 17:40:09.711219 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) May 27 17:40:09.711225 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) May 27 17:40:09.711231 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) May 27 17:40:09.711236 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) May 27 17:40:09.711241 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) May 27 17:40:09.711246 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) May 27 17:40:09.711252 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) May 27 17:40:09.711257 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x39] high edge lint[0x1]) May 27 17:40:09.711262 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) May 27 17:40:09.711268 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) May 27 17:40:09.711273 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) May 27 17:40:09.712903 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) May 27 17:40:09.712912 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) May 27 17:40:09.712917 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) May 27 17:40:09.712923 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) May 27 17:40:09.712928 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) May 27 17:40:09.712934 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) May 27 17:40:09.712939 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) May 27 17:40:09.712945 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) May 27 17:40:09.712950 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) May 27 17:40:09.712955 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) May 27 17:40:09.712963 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) May 27 17:40:09.712968 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) May 27 17:40:09.712974 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) May 27 17:40:09.712979 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) May 27 17:40:09.712985 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) May 27 17:40:09.712990 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) May 27 17:40:09.712996 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) May 27 17:40:09.713001 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) May 27 17:40:09.713006 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) May 27 17:40:09.713013 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) May 27 17:40:09.713018 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x51] high edge lint[0x1]) May 27 17:40:09.713024 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) May 27 17:40:09.713029 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) May 27 17:40:09.713035 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) May 27 17:40:09.713040 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) May 27 17:40:09.713046 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) May 27 17:40:09.713053 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) May 27 17:40:09.713058 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) May 27 17:40:09.713063 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) May 27 17:40:09.713070 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) May 27 17:40:09.713075 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) May 27 17:40:09.713081 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) May 27 17:40:09.713087 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) May 27 17:40:09.713092 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) May 27 17:40:09.713098 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) May 27 17:40:09.713103 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) May 27 17:40:09.713108 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) May 27 17:40:09.713114 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) May 27 17:40:09.713119 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) May 27 17:40:09.713126 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) May 27 17:40:09.713131 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) May 27 17:40:09.713137 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) May 27 17:40:09.713142 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) May 27 17:40:09.713147 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) May 27 17:40:09.713153 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x69] high edge lint[0x1]) May 27 17:40:09.713158 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) May 27 17:40:09.713164 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) May 27 17:40:09.713169 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) May 27 17:40:09.713176 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) May 27 17:40:09.713181 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) May 27 17:40:09.713187 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) May 27 17:40:09.713192 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) May 27 17:40:09.713197 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) May 27 17:40:09.713203 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) May 27 17:40:09.713208 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) May 27 17:40:09.713214 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) May 27 17:40:09.713219 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) May 27 17:40:09.713224 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) May 27 17:40:09.713231 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) May 27 17:40:09.713236 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) May 27 17:40:09.713241 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) May 27 17:40:09.713247 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) May 27 17:40:09.713252 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) May 27 17:40:09.713258 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) May 27 17:40:09.713263 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) May 27 17:40:09.713268 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) May 27 17:40:09.713274 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) May 27 17:40:09.713287 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 May 27 17:40:09.713295 kernel: ACPI: 
INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) May 27 17:40:09.713300 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 27 17:40:09.713306 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 May 27 17:40:09.713311 kernel: TSC deadline timer available May 27 17:40:09.713317 kernel: CPU topo: Max. logical packages: 128 May 27 17:40:09.713322 kernel: CPU topo: Max. logical dies: 128 May 27 17:40:09.713328 kernel: CPU topo: Max. dies per package: 1 May 27 17:40:09.713333 kernel: CPU topo: Max. threads per core: 1 May 27 17:40:09.713338 kernel: CPU topo: Num. cores per package: 1 May 27 17:40:09.713344 kernel: CPU topo: Num. threads per package: 1 May 27 17:40:09.713350 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs May 27 17:40:09.713356 kernel: [mem 0x80000000-0xefffffff] available for PCI devices May 27 17:40:09.713361 kernel: Booting paravirtualized kernel on VMware hypervisor May 27 17:40:09.713367 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 27 17:40:09.713372 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 May 27 17:40:09.713378 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 May 27 17:40:09.713384 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 May 27 17:40:09.713389 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 May 27 17:40:09.713395 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 May 27 17:40:09.713401 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 May 27 17:40:09.713407 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 May 27 17:40:09.713412 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 May 27 17:40:09.713417 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 May 27 17:40:09.713423 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 May 27 17:40:09.713428 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 May 27 
17:40:09.713433 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 May 27 17:40:09.713439 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 May 27 17:40:09.713444 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 May 27 17:40:09.713451 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 May 27 17:40:09.713456 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 May 27 17:40:09.713462 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 May 27 17:40:09.713467 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 May 27 17:40:09.713473 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 May 27 17:40:09.713479 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 17:40:09.713485 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 17:40:09.713492 kernel: random: crng init done May 27 17:40:09.713497 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes May 27 17:40:09.713503 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes May 27 17:40:09.713508 kernel: printk: log_buf_len min size: 262144 bytes May 27 17:40:09.713514 kernel: printk: log_buf_len: 1048576 bytes May 27 17:40:09.713520 kernel: printk: early log buf free: 245576(93%) May 27 17:40:09.713525 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 17:40:09.713531 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 27 17:40:09.713536 kernel: Fallback order for Node 0: 0 May 27 17:40:09.713543 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 524157 May 27 17:40:09.713548 kernel: Policy zone: DMA32 May 27 17:40:09.713554 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 17:40:09.713559 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 May 27 17:40:09.713565 kernel: ftrace: allocating 40081 entries in 157 pages May 27 17:40:09.713570 kernel: ftrace: allocated 157 pages with 5 groups May 27 17:40:09.713576 kernel: Dynamic Preempt: voluntary May 27 17:40:09.713581 kernel: rcu: Preemptible hierarchical RCU implementation. May 27 17:40:09.713588 kernel: rcu: RCU event tracing is enabled. May 27 17:40:09.713593 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. May 27 17:40:09.713600 kernel: Trampoline variant of Tasks RCU enabled. May 27 17:40:09.713606 kernel: Rude variant of Tasks RCU enabled. May 27 17:40:09.713611 kernel: Tracing variant of Tasks RCU enabled. May 27 17:40:09.713617 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 27 17:40:09.713622 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 May 27 17:40:09.713628 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 27 17:40:09.713633 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 27 17:40:09.713639 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 27 17:40:09.713644 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 May 27 17:40:09.713651 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. 
May 27 17:40:09.713657 kernel: Console: colour VGA+ 80x25 May 27 17:40:09.713662 kernel: printk: legacy console [tty0] enabled May 27 17:40:09.713668 kernel: printk: legacy console [ttyS0] enabled May 27 17:40:09.713673 kernel: ACPI: Core revision 20240827 May 27 17:40:09.713679 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns May 27 17:40:09.713684 kernel: APIC: Switch to symmetric I/O mode setup May 27 17:40:09.713690 kernel: x2apic enabled May 27 17:40:09.713695 kernel: APIC: Switched APIC routing to: physical x2apic May 27 17:40:09.713702 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 27 17:40:09.713708 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns May 27 17:40:09.713713 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000) May 27 17:40:09.713719 kernel: Disabled fast string operations May 27 17:40:09.713724 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 May 27 17:40:09.713730 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 May 27 17:40:09.713736 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 27 17:40:09.713741 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit May 27 17:40:09.713747 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS May 27 17:40:09.713753 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT May 27 17:40:09.713759 kernel: RETBleed: Mitigation: Enhanced IBRS May 27 17:40:09.713765 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier May 27 17:40:09.713770 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl May 27 17:40:09.713776 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode May 27 17:40:09.713781 kernel: SRBDS: Unknown: Dependent on hypervisor 
status May 27 17:40:09.713787 kernel: GDS: Unknown: Dependent on hypervisor status May 27 17:40:09.713792 kernel: ITS: Mitigation: Aligned branch/return thunks May 27 17:40:09.713798 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 27 17:40:09.713804 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 27 17:40:09.713810 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 27 17:40:09.713815 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 27 17:40:09.713821 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. May 27 17:40:09.713826 kernel: Freeing SMP alternatives memory: 32K May 27 17:40:09.713832 kernel: pid_max: default: 131072 minimum: 1024 May 27 17:40:09.713837 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 17:40:09.713843 kernel: landlock: Up and running. May 27 17:40:09.713848 kernel: SELinux: Initializing. May 27 17:40:09.713855 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 27 17:40:09.713861 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 27 17:40:09.713866 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) May 27 17:40:09.713872 kernel: Performance Events: Skylake events, core PMU driver. 
May 27 17:40:09.713877 kernel: core: CPUID marked event: 'cpu cycles' unavailable May 27 17:40:09.713883 kernel: core: CPUID marked event: 'instructions' unavailable May 27 17:40:09.713888 kernel: core: CPUID marked event: 'bus cycles' unavailable May 27 17:40:09.713894 kernel: core: CPUID marked event: 'cache references' unavailable May 27 17:40:09.713900 kernel: core: CPUID marked event: 'cache misses' unavailable May 27 17:40:09.713906 kernel: core: CPUID marked event: 'branch instructions' unavailable May 27 17:40:09.713911 kernel: core: CPUID marked event: 'branch misses' unavailable May 27 17:40:09.713916 kernel: ... version: 1 May 27 17:40:09.713922 kernel: ... bit width: 48 May 27 17:40:09.713927 kernel: ... generic registers: 4 May 27 17:40:09.713933 kernel: ... value mask: 0000ffffffffffff May 27 17:40:09.713938 kernel: ... max period: 000000007fffffff May 27 17:40:09.713944 kernel: ... fixed-purpose events: 0 May 27 17:40:09.713950 kernel: ... event mask: 000000000000000f May 27 17:40:09.713956 kernel: signal: max sigframe size: 1776 May 27 17:40:09.713961 kernel: rcu: Hierarchical SRCU implementation. May 27 17:40:09.713967 kernel: rcu: Max phase no-delay instances is 400. May 27 17:40:09.713973 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level May 27 17:40:09.713978 kernel: NMI watchdog: Perf NMI watchdog permanently disabled May 27 17:40:09.713984 kernel: smp: Bringing up secondary CPUs ... May 27 17:40:09.713989 kernel: smpboot: x86: Booting SMP configuration: May 27 17:40:09.713995 kernel: .... 
node #0, CPUs: #1 May 27 17:40:09.714000 kernel: Disabled fast string operations May 27 17:40:09.714007 kernel: smp: Brought up 1 node, 2 CPUs May 27 17:40:09.714012 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) May 27 17:40:09.714018 kernel: Memory: 1924252K/2096628K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 160992K reserved, 0K cma-reserved) May 27 17:40:09.714024 kernel: devtmpfs: initialized May 27 17:40:09.714029 kernel: x86/mm: Memory block size: 128MB May 27 17:40:09.714035 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) May 27 17:40:09.714040 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 17:40:09.714046 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) May 27 17:40:09.714051 kernel: pinctrl core: initialized pinctrl subsystem May 27 17:40:09.714058 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 17:40:09.714064 kernel: audit: initializing netlink subsys (disabled) May 27 17:40:09.714069 kernel: audit: type=2000 audit(1748367606.281:1): state=initialized audit_enabled=0 res=1 May 27 17:40:09.714075 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 17:40:09.714080 kernel: thermal_sys: Registered thermal governor 'user_space' May 27 17:40:09.714086 kernel: cpuidle: using governor menu May 27 17:40:09.714091 kernel: Simple Boot Flag at 0x36 set to 0x80 May 27 17:40:09.714097 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 17:40:09.714102 kernel: dca service started, version 1.12.1 May 27 17:40:09.714109 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f] May 27 17:40:09.714122 kernel: PCI: Using configuration type 1 for base access May 27 17:40:09.714129 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 27 17:40:09.714135 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 17:40:09.714140 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 27 17:40:09.714146 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 17:40:09.714152 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 27 17:40:09.714158 kernel: ACPI: Added _OSI(Module Device) May 27 17:40:09.714165 kernel: ACPI: Added _OSI(Processor Device) May 27 17:40:09.714170 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 17:40:09.714176 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 17:40:09.714182 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 27 17:40:09.714188 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored May 27 17:40:09.714194 kernel: ACPI: Interpreter enabled May 27 17:40:09.714200 kernel: ACPI: PM: (supports S0 S1 S5) May 27 17:40:09.714206 kernel: ACPI: Using IOAPIC for interrupt routing May 27 17:40:09.714212 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 27 17:40:09.714219 kernel: PCI: Using E820 reservations for host bridge windows May 27 17:40:09.714224 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F May 27 17:40:09.714230 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) May 27 17:40:09.715339 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 17:40:09.715401 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] May 27 17:40:09.715452 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] May 27 17:40:09.715461 kernel: PCI host bridge to bus 0000:00 May 27 17:40:09.715512 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 27 17:40:09.715561 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] May 27 17:40:09.715605 kernel: pci_bus 0000:00: root bus 
resource [mem 0xc0000000-0xfebfffff window] May 27 17:40:09.715649 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 27 17:40:09.715692 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] May 27 17:40:09.715735 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] May 27 17:40:09.715795 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint May 27 17:40:09.715858 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge May 27 17:40:09.715911 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 27 17:40:09.715968 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint May 27 17:40:09.716023 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint May 27 17:40:09.716091 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f] May 27 17:40:09.716140 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk May 27 17:40:09.716189 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk May 27 17:40:09.716237 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk May 27 17:40:09.717302 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk May 27 17:40:09.717368 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint May 27 17:40:09.717426 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI May 27 17:40:09.717477 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB May 27 17:40:09.717534 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint May 27 17:40:09.717584 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf] May 27 17:40:09.717633 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit] May 27 17:40:09.717686 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint May 27 17:40:09.717739 kernel: pci 0000:00:0f.0: BAR 0 [io 
0x1070-0x107f] May 27 17:40:09.717788 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref] May 27 17:40:09.717837 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff] May 27 17:40:09.717916 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref] May 27 17:40:09.717980 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 27 17:40:09.718033 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge May 27 17:40:09.718082 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) May 27 17:40:09.718133 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] May 27 17:40:09.718182 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] May 27 17:40:09.718229 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] May 27 17:40:09.719347 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.719415 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 27 17:40:09.719469 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] May 27 17:40:09.719520 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] May 27 17:40:09.719571 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold May 27 17:40:09.719628 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.719679 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 27 17:40:09.719728 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] May 27 17:40:09.719777 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] May 27 17:40:09.719827 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] May 27 17:40:09.719877 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold May 27 17:40:09.719932 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.720003 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 27 
17:40:09.720053 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] May 27 17:40:09.720103 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] May 27 17:40:09.720153 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] May 27 17:40:09.720203 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold May 27 17:40:09.720259 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.720456 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 27 17:40:09.720509 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] May 27 17:40:09.720559 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] May 27 17:40:09.720609 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold May 27 17:40:09.720667 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.720719 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 27 17:40:09.720769 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] May 27 17:40:09.720822 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] May 27 17:40:09.720872 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold May 27 17:40:09.720926 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.720977 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 27 17:40:09.721032 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] May 27 17:40:09.721088 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] May 27 17:40:09.721148 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold May 27 17:40:09.721207 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.721259 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 27 17:40:09.721323 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] May 27 17:40:09.721375 kernel: pci 
0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] May 27 17:40:09.721426 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold May 27 17:40:09.721480 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.721531 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 27 17:40:09.721599 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] May 27 17:40:09.721654 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] May 27 17:40:09.721705 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold May 27 17:40:09.721777 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.721831 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 27 17:40:09.721886 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] May 27 17:40:09.721936 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] May 27 17:40:09.721989 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold May 27 17:40:09.722043 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.722093 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 27 17:40:09.722143 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] May 27 17:40:09.722193 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] May 27 17:40:09.722242 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] May 27 17:40:09.723208 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold May 27 17:40:09.723791 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.723875 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 27 17:40:09.723945 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] May 27 17:40:09.723996 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] May 27 17:40:09.724047 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 
64bit pref] May 27 17:40:09.724097 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold May 27 17:40:09.724151 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.724205 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 27 17:40:09.724260 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] May 27 17:40:09.726498 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] May 27 17:40:09.726559 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold May 27 17:40:09.726616 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.726668 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 27 17:40:09.726719 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] May 27 17:40:09.726773 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] May 27 17:40:09.726823 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold May 27 17:40:09.726894 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.726979 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 27 17:40:09.727045 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] May 27 17:40:09.727095 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] May 27 17:40:09.727143 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold May 27 17:40:09.727200 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.727253 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 27 17:40:09.727330 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] May 27 17:40:09.727389 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] May 27 17:40:09.727450 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold May 27 17:40:09.727506 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 
17:40:09.727557 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 27 17:40:09.727637 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] May 27 17:40:09.727692 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] May 27 17:40:09.727761 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold May 27 17:40:09.727816 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.727881 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 27 17:40:09.727935 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] May 27 17:40:09.727985 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] May 27 17:40:09.728036 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] May 27 17:40:09.728092 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold May 27 17:40:09.728147 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.728199 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 27 17:40:09.728250 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] May 27 17:40:09.728318 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] May 27 17:40:09.728372 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] May 27 17:40:09.728423 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold May 27 17:40:09.728478 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.728529 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 27 17:40:09.728596 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] May 27 17:40:09.728646 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] May 27 17:40:09.728698 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] May 27 17:40:09.728748 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold May 27 17:40:09.728805 kernel: pci 0000:00:17.3: [15ad:07a0] 
type 01 class 0x060400 PCIe Root Port May 27 17:40:09.728856 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 27 17:40:09.728906 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] May 27 17:40:09.728956 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] May 27 17:40:09.729006 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold May 27 17:40:09.729062 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.729114 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 27 17:40:09.729164 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] May 27 17:40:09.729214 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] May 27 17:40:09.729264 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold May 27 17:40:09.729343 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.729396 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 27 17:40:09.729465 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] May 27 17:40:09.729514 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] May 27 17:40:09.729564 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold May 27 17:40:09.729619 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.729670 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 27 17:40:09.729788 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] May 27 17:40:09.731343 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] May 27 17:40:09.731410 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold May 27 17:40:09.731473 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.731528 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 27 17:40:09.731580 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] 
May 27 17:40:09.731632 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] May 27 17:40:09.731684 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold May 27 17:40:09.731739 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.731794 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 27 17:40:09.731845 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] May 27 17:40:09.731895 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] May 27 17:40:09.731946 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] May 27 17:40:09.731997 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold May 27 17:40:09.732053 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.732105 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 27 17:40:09.732158 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] May 27 17:40:09.732223 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] May 27 17:40:09.732275 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] May 27 17:40:09.732340 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold May 27 17:40:09.732397 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.732449 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 27 17:40:09.732500 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] May 27 17:40:09.732553 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] May 27 17:40:09.732604 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold May 27 17:40:09.732660 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.732712 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 27 17:40:09.732764 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] May 27 17:40:09.732815 kernel: pci 
0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] May 27 17:40:09.732866 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold May 27 17:40:09.732922 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.732976 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 27 17:40:09.733027 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] May 27 17:40:09.733077 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] May 27 17:40:09.733128 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold May 27 17:40:09.733184 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.733235 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 27 17:40:09.736365 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] May 27 17:40:09.736443 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] May 27 17:40:09.736503 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold May 27 17:40:09.736563 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.736616 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 27 17:40:09.736673 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] May 27 17:40:09.736739 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] May 27 17:40:09.736791 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold May 27 17:40:09.736862 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port May 27 17:40:09.736919 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 27 17:40:09.736971 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] May 27 17:40:09.737022 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] May 27 17:40:09.737073 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold May 27 17:40:09.737129 kernel: pci_bus 0000:01: 
extended config space not accessible May 27 17:40:09.737182 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 27 17:40:09.737239 kernel: pci_bus 0000:02: extended config space not accessible May 27 17:40:09.737248 kernel: acpiphp: Slot [32] registered May 27 17:40:09.737254 kernel: acpiphp: Slot [33] registered May 27 17:40:09.737260 kernel: acpiphp: Slot [34] registered May 27 17:40:09.737266 kernel: acpiphp: Slot [35] registered May 27 17:40:09.737272 kernel: acpiphp: Slot [36] registered May 27 17:40:09.737299 kernel: acpiphp: Slot [37] registered May 27 17:40:09.737307 kernel: acpiphp: Slot [38] registered May 27 17:40:09.737313 kernel: acpiphp: Slot [39] registered May 27 17:40:09.737321 kernel: acpiphp: Slot [40] registered May 27 17:40:09.737327 kernel: acpiphp: Slot [41] registered May 27 17:40:09.737333 kernel: acpiphp: Slot [42] registered May 27 17:40:09.737339 kernel: acpiphp: Slot [43] registered May 27 17:40:09.737345 kernel: acpiphp: Slot [44] registered May 27 17:40:09.737350 kernel: acpiphp: Slot [45] registered May 27 17:40:09.737356 kernel: acpiphp: Slot [46] registered May 27 17:40:09.737362 kernel: acpiphp: Slot [47] registered May 27 17:40:09.737368 kernel: acpiphp: Slot [48] registered May 27 17:40:09.737374 kernel: acpiphp: Slot [49] registered May 27 17:40:09.737381 kernel: acpiphp: Slot [50] registered May 27 17:40:09.737387 kernel: acpiphp: Slot [51] registered May 27 17:40:09.737393 kernel: acpiphp: Slot [52] registered May 27 17:40:09.737398 kernel: acpiphp: Slot [53] registered May 27 17:40:09.737404 kernel: acpiphp: Slot [54] registered May 27 17:40:09.737410 kernel: acpiphp: Slot [55] registered May 27 17:40:09.737416 kernel: acpiphp: Slot [56] registered May 27 17:40:09.737422 kernel: acpiphp: Slot [57] registered May 27 17:40:09.737428 kernel: acpiphp: Slot [58] registered May 27 17:40:09.737435 kernel: acpiphp: Slot [59] registered May 27 17:40:09.737441 kernel: acpiphp: Slot [60] registered May 27 17:40:09.737446 kernel: 
acpiphp: Slot [61] registered May 27 17:40:09.737452 kernel: acpiphp: Slot [62] registered May 27 17:40:09.737458 kernel: acpiphp: Slot [63] registered May 27 17:40:09.737516 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) May 27 17:40:09.737567 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) May 27 17:40:09.737618 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) May 27 17:40:09.737669 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) May 27 17:40:09.737721 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) May 27 17:40:09.737772 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) May 27 17:40:09.737830 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint May 27 17:40:09.737901 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007] May 27 17:40:09.737956 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit] May 27 17:40:09.738008 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref] May 27 17:40:09.738060 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold May 27 17:40:09.738115 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' May 27 17:40:09.738168 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 27 17:40:09.738221 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 27 17:40:09.738274 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 27 17:40:09.738652 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 27 17:40:09.738708 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 27 17:40:09.738762 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 27 17:40:09.738818 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 27 17:40:09.738872 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 27 17:40:09.738932 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint May 27 17:40:09.738987 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff] May 27 17:40:09.739231 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff] May 27 17:40:09.742241 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff] May 27 17:40:09.742316 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f] May 27 17:40:09.742383 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref] May 27 17:40:09.744044 kernel: pci 0000:0b:00.0: supports D1 D2 May 27 17:40:09.744116 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 17:40:09.744176 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' May 27 17:40:09.744231 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 27 17:40:09.744299 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 27 17:40:09.744358 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 27 17:40:09.744419 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 27 17:40:09.744478 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 27 17:40:09.744531 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 27 17:40:09.744591 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 27 17:40:09.744644 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 27 17:40:09.744698 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 27 17:40:09.744751 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 27 17:40:09.744804 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 27 17:40:09.744861 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 27 17:40:09.744915 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 27 17:40:09.744968 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 27 17:40:09.745021 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 27 17:40:09.745075 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 27 17:40:09.745129 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 27 17:40:09.745181 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 27 17:40:09.745235 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 27 17:40:09.745303 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 27 17:40:09.745360 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 27 17:40:09.745413 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 27 17:40:09.745466 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 27 17:40:09.745519 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 27 17:40:09.745528 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 May 27 17:40:09.745535 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 May 27 17:40:09.745543 kernel: ACPI: PCI: Interrupt link LNKB 
disabled May 27 17:40:09.745549 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 27 17:40:09.745555 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 May 27 17:40:09.745561 kernel: iommu: Default domain type: Translated May 27 17:40:09.745567 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 27 17:40:09.745573 kernel: PCI: Using ACPI for IRQ routing May 27 17:40:09.745578 kernel: PCI: pci_cache_line_size set to 64 bytes May 27 17:40:09.745585 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] May 27 17:40:09.745591 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] May 27 17:40:09.745643 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device May 27 17:40:09.745694 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible May 27 17:40:09.745744 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 27 17:40:09.745753 kernel: vgaarb: loaded May 27 17:40:09.745760 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 May 27 17:40:09.745766 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter May 27 17:40:09.745772 kernel: clocksource: Switched to clocksource tsc-early May 27 17:40:09.745778 kernel: VFS: Disk quotas dquot_6.6.0 May 27 17:40:09.745784 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 17:40:09.745792 kernel: pnp: PnP ACPI init May 27 17:40:09.745849 kernel: system 00:00: [io 0x1000-0x103f] has been reserved May 27 17:40:09.745897 kernel: system 00:00: [io 0x1040-0x104f] has been reserved May 27 17:40:09.745943 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved May 27 17:40:09.745993 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved May 27 17:40:09.746045 kernel: pnp 00:06: [dma 2] May 27 17:40:09.746097 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved May 27 17:40:09.746144 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved May 27 
17:40:09.746190 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved May 27 17:40:09.746198 kernel: pnp: PnP ACPI: found 8 devices May 27 17:40:09.746204 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 27 17:40:09.746211 kernel: NET: Registered PF_INET protocol family May 27 17:40:09.746217 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) May 27 17:40:09.746223 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) May 27 17:40:09.746231 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 17:40:09.746237 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) May 27 17:40:09.746243 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) May 27 17:40:09.746249 kernel: TCP: Hash tables configured (established 16384 bind 16384) May 27 17:40:09.746255 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) May 27 17:40:09.746261 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) May 27 17:40:09.746267 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 17:40:09.746273 kernel: NET: Registered PF_XDP protocol family May 27 17:40:09.746341 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 17:40:09.746399 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 May 27 17:40:09.746453 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 27 17:40:09.746507 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 27 17:40:09.746560 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 27 17:40:09.746614 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 May 27 17:40:09.746668 
kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 May 27 17:40:09.746720 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 May 27 17:40:09.746773 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 May 27 17:40:09.746828 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 May 27 17:40:09.746891 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 May 27 17:40:09.746945 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 May 27 17:40:09.746997 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 May 27 17:40:09.747049 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 May 27 17:40:09.747100 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 May 27 17:40:09.747153 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 May 27 17:40:09.747207 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 May 27 17:40:09.747258 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 May 27 17:40:09.747324 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 May 27 17:40:09.747377 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 May 27 17:40:09.747428 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 May 27 17:40:09.747480 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 May 27 17:40:09.747531 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 May 27 17:40:09.747583 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned May 27 17:40:09.747637 kernel: pci 
0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned May 27 17:40:09.747688 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.747739 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.747790 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.747840 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.747906 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.747958 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748009 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748062 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748113 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748163 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748214 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748264 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748337 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748390 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748441 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748495 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748545 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748596 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748646 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space 
May 27 17:40:09.748696 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748747 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748797 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748851 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.748901 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.748951 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749001 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749052 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749102 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749152 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749202 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749255 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749323 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749382 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749440 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749492 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749543 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749594 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.749644 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749697 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: 
can't assign; no space May 27 17:40:09.749749 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.749801 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751363 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751423 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751477 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751530 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751582 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751633 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751688 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751739 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751790 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751841 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751892 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.751943 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.751993 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752044 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752095 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752148 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752199 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752251 kernel: pci 0000:00:17.5: bridge 
window [io size 0x1000]: can't assign; no space May 27 17:40:09.752323 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752377 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752428 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752478 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752529 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752579 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752630 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752684 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752735 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752784 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752835 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752886 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.752936 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.752986 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753036 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753090 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753140 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753190 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753240 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753307 kernel: pci 
0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753360 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753412 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753465 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753516 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space May 27 17:40:09.753566 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign May 27 17:40:09.753619 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 27 17:40:09.753670 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] May 27 17:40:09.753719 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] May 27 17:40:09.753769 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] May 27 17:40:09.753819 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] May 27 17:40:09.753882 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned May 27 17:40:09.753935 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 27 17:40:09.753986 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] May 27 17:40:09.754037 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] May 27 17:40:09.754087 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] May 27 17:40:09.754140 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 27 17:40:09.754191 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] May 27 17:40:09.754243 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] May 27 17:40:09.754304 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] May 27 17:40:09.754358 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 27 17:40:09.754412 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] May 27 17:40:09.754463 kernel: pci 0000:00:15.2: bridge window 
[mem 0xfcd00000-0xfcdfffff] May 27 17:40:09.754513 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] May 27 17:40:09.754565 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 27 17:40:09.754615 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] May 27 17:40:09.754666 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] May 27 17:40:09.754717 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 27 17:40:09.754768 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] May 27 17:40:09.754819 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] May 27 17:40:09.754872 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 27 17:40:09.754926 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] May 27 17:40:09.754977 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] May 27 17:40:09.755029 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 27 17:40:09.755080 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] May 27 17:40:09.755131 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] May 27 17:40:09.755185 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 27 17:40:09.755236 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] May 27 17:40:09.756021 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] May 27 17:40:09.756088 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned May 27 17:40:09.756145 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 27 17:40:09.756198 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] May 27 17:40:09.756253 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] May 27 17:40:09.756346 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] May 27 17:40:09.756404 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 27 17:40:09.756455 
kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] May 27 17:40:09.756506 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] May 27 17:40:09.756556 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] May 27 17:40:09.756608 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 27 17:40:09.756658 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] May 27 17:40:09.756708 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] May 27 17:40:09.756759 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] May 27 17:40:09.756810 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 27 17:40:09.756864 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] May 27 17:40:09.756918 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] May 27 17:40:09.756969 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 27 17:40:09.757019 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] May 27 17:40:09.757069 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] May 27 17:40:09.757119 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 27 17:40:09.757169 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] May 27 17:40:09.757219 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] May 27 17:40:09.757272 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 27 17:40:09.757338 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] May 27 17:40:09.757389 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] May 27 17:40:09.757439 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 27 17:40:09.757490 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] May 27 17:40:09.757838 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] May 27 17:40:09.757905 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 
27 17:40:09.757962 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] May 27 17:40:09.758014 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] May 27 17:40:09.758064 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] May 27 17:40:09.758117 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 27 17:40:09.758168 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] May 27 17:40:09.758218 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] May 27 17:40:09.758268 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] May 27 17:40:09.758334 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 27 17:40:09.758386 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] May 27 17:40:09.758436 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] May 27 17:40:09.758489 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] May 27 17:40:09.759165 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 27 17:40:09.759229 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] May 27 17:40:09.759300 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] May 27 17:40:09.759357 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 27 17:40:09.759409 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] May 27 17:40:09.759461 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] May 27 17:40:09.759512 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 27 17:40:09.759565 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] May 27 17:40:09.759616 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] May 27 17:40:09.759667 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 27 17:40:09.759718 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] May 27 17:40:09.759768 kernel: pci 0000:00:17.6: bridge window [mem 
0xe6200000-0xe62fffff 64bit pref] May 27 17:40:09.759819 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 27 17:40:09.759876 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] May 27 17:40:09.759927 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] May 27 17:40:09.759981 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 27 17:40:09.760032 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] May 27 17:40:09.760082 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] May 27 17:40:09.760132 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] May 27 17:40:09.760183 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 27 17:40:09.760249 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] May 27 17:40:09.760313 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] May 27 17:40:09.760365 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] May 27 17:40:09.760419 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 27 17:40:09.760468 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] May 27 17:40:09.760517 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] May 27 17:40:09.760568 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 27 17:40:09.760618 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] May 27 17:40:09.760667 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] May 27 17:40:09.760717 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 27 17:40:09.760768 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] May 27 17:40:09.760820 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] May 27 17:40:09.760874 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 27 17:40:09.760956 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] May 27 17:40:09.761005 kernel: pci 0000:00:18.5: 
bridge window [mem 0xe6500000-0xe65fffff 64bit pref] May 27 17:40:09.761055 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 27 17:40:09.761104 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] May 27 17:40:09.761153 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] May 27 17:40:09.761206 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 27 17:40:09.761256 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] May 27 17:40:09.761326 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] May 27 17:40:09.761378 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] May 27 17:40:09.762352 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] May 27 17:40:09.762404 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] May 27 17:40:09.762451 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] May 27 17:40:09.762499 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] May 27 17:40:09.762549 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] May 27 17:40:09.762596 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] May 27 17:40:09.762643 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] May 27 17:40:09.762689 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] May 27 17:40:09.762736 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] May 27 17:40:09.762785 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] May 27 17:40:09.762834 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] May 27 17:40:09.762894 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] May 27 17:40:09.762944 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] May 27 17:40:09.762991 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] May 27 17:40:09.763037 kernel: pci_bus 0000:03: resource 2 [mem 
0xc0000000-0xc01fffff 64bit pref] May 27 17:40:09.763087 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] May 27 17:40:09.763133 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] May 27 17:40:09.763181 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] May 27 17:40:09.763233 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] May 27 17:40:09.764208 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] May 27 17:40:09.764268 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] May 27 17:40:09.764355 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] May 27 17:40:09.764420 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] May 27 17:40:09.764472 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] May 27 17:40:09.764522 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] May 27 17:40:09.764581 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] May 27 17:40:09.765362 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] May 27 17:40:09.765418 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] May 27 17:40:09.765466 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] May 27 17:40:09.765517 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] May 27 17:40:09.765566 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] May 27 17:40:09.765616 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] May 27 17:40:09.765661 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] May 27 17:40:09.765707 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] May 27 17:40:09.765756 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] May 27 17:40:09.765801 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] May 27 17:40:09.765848 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 
64bit pref] May 27 17:40:09.765898 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] May 27 17:40:09.765944 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] May 27 17:40:09.765988 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] May 27 17:40:09.766038 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] May 27 17:40:09.766096 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] May 27 17:40:09.766152 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] May 27 17:40:09.766199 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] May 27 17:40:09.766248 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] May 27 17:40:09.766328 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] May 27 17:40:09.766379 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] May 27 17:40:09.766425 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] May 27 17:40:09.766475 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] May 27 17:40:09.766524 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] May 27 17:40:09.766574 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] May 27 17:40:09.766620 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] May 27 17:40:09.766665 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] May 27 17:40:09.766714 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] May 27 17:40:09.766760 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] May 27 17:40:09.766808 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] May 27 17:40:09.766862 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] May 27 17:40:09.766909 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] May 27 17:40:09.766954 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] May 27 
17:40:09.767005 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] May 27 17:40:09.767050 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] May 27 17:40:09.767101 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] May 27 17:40:09.767149 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] May 27 17:40:09.767198 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] May 27 17:40:09.767244 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] May 27 17:40:09.767316 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] May 27 17:40:09.767364 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] May 27 17:40:09.767412 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] May 27 17:40:09.767462 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] May 27 17:40:09.767512 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] May 27 17:40:09.767557 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] May 27 17:40:09.767602 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] May 27 17:40:09.767652 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] May 27 17:40:09.767697 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] May 27 17:40:09.767763 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] May 27 17:40:09.767823 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] May 27 17:40:09.767891 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] May 27 17:40:09.767944 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] May 27 17:40:09.767991 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] May 27 17:40:09.768041 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] May 27 17:40:09.768088 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] 
May 27 17:40:09.768141 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] May 27 17:40:09.768186 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] May 27 17:40:09.768237 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] May 27 17:40:09.768309 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] May 27 17:40:09.768364 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] May 27 17:40:09.768412 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] May 27 17:40:09.768471 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 27 17:40:09.768481 kernel: PCI: CLS 32 bytes, default 64 May 27 17:40:09.768488 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer May 27 17:40:09.768494 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns May 27 17:40:09.768500 kernel: clocksource: Switched to clocksource tsc May 27 17:40:09.768506 kernel: Initialise system trusted keyrings May 27 17:40:09.768512 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 27 17:40:09.768518 kernel: Key type asymmetric registered May 27 17:40:09.768526 kernel: Asymmetric key parser 'x509' registered May 27 17:40:09.768532 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 27 17:40:09.768539 kernel: io scheduler mq-deadline registered May 27 17:40:09.768545 kernel: io scheduler kyber registered May 27 17:40:09.768550 kernel: io scheduler bfq registered May 27 17:40:09.768603 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 May 27 17:40:09.768656 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.768709 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 May 27 17:40:09.768763 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 
AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.768815 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 May 27 17:40:09.768865 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.768918 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 May 27 17:40:09.768970 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769039 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 May 27 17:40:09.769109 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769162 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 May 27 17:40:09.769216 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769268 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 May 27 17:40:09.769344 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769397 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 May 27 17:40:09.769449 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769500 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 May 27 17:40:09.769551 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769607 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 May 27 17:40:09.769658 kernel: pcieport 
0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769710 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 May 27 17:40:09.769761 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769813 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 May 27 17:40:09.769874 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.769928 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 May 27 17:40:09.769981 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.770033 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 May 27 17:40:09.770083 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.770134 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 May 27 17:40:09.770185 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.770237 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 May 27 17:40:09.770305 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.770364 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 May 27 17:40:09.770416 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 27 17:40:09.770467 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 May 27 
17:40:09.770518 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.770569 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
May 27 17:40:09.770620 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.770674 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
May 27 17:40:09.770726 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.770780 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
May 27 17:40:09.770831 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.770890 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
May 27 17:40:09.770942 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.770993 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
May 27 17:40:09.771044 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771095 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
May 27 17:40:09.771150 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771202 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
May 27 17:40:09.771253 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771322 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
May 27 17:40:09.771375 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771426 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
May 27 17:40:09.771477 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771528 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
May 27 17:40:09.771583 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771634 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
May 27 17:40:09.771685 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771737 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
May 27 17:40:09.771788 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771840 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
May 27 17:40:09.771890 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.771945 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
May 27 17:40:09.771996 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
May 27 17:40:09.772007 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 27 17:40:09.772014 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 17:40:09.772020 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 27 17:40:09.772027 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12
May 27 17:40:09.772033 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 27 17:40:09.772039 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 27 17:40:09.772338 kernel: rtc_cmos 00:01: registered as rtc0
May 27 17:40:09.772392 kernel: rtc_cmos 00:01: setting system clock to 2025-05-27T17:40:09 UTC (1748367609)
May 27 17:40:09.772401 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 27 17:40:09.772447 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram
May 27 17:40:09.772456 kernel: intel_pstate: CPU model not supported
May 27 17:40:09.772463 kernel: NET: Registered PF_INET6 protocol family
May 27 17:40:09.772469 kernel: Segment Routing with IPv6
May 27 17:40:09.772475 kernel: In-situ OAM (IOAM) with IPv6
May 27 17:40:09.772484 kernel: NET: Registered PF_PACKET protocol family
May 27 17:40:09.772490 kernel: Key type dns_resolver registered
May 27 17:40:09.772496 kernel: IPI shorthand broadcast: enabled
May 27 17:40:09.772503 kernel: sched_clock: Marking stable (2691003789, 170222315)->(2874398501, -13172397)
May 27 17:40:09.772509 kernel: registered taskstats version 1
May 27 17:40:09.772515 kernel: Loading compiled-in X.509 certificates
May 27 17:40:09.772521 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 9507e5c390e18536b38d58c90da64baf0ac9837c'
May 27 17:40:09.772528 kernel: Demotion targets for Node 0: null
May 27 17:40:09.772534 kernel: Key type .fscrypt registered
May 27 17:40:09.772541 kernel: Key type fscrypt-provisioning registered
May 27 17:40:09.772547 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 17:40:09.772554 kernel: ima: Allocated hash algorithm: sha1
May 27 17:40:09.772560 kernel: ima: No architecture policies found
May 27 17:40:09.772566 kernel: clk: Disabling unused clocks
May 27 17:40:09.772572 kernel: Warning: unable to open an initial console.
May 27 17:40:09.772579 kernel: Freeing unused kernel image (initmem) memory: 54416K
May 27 17:40:09.772586 kernel: Write protecting the kernel read-only data: 24576k
May 27 17:40:09.772593 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K
May 27 17:40:09.772599 kernel: Run /init as init process
May 27 17:40:09.772606 kernel: with arguments:
May 27 17:40:09.772612 kernel: /init
May 27 17:40:09.772618 kernel: with environment:
May 27 17:40:09.772624 kernel: HOME=/
May 27 17:40:09.772630 kernel: TERM=linux
May 27 17:40:09.772636 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 17:40:09.772643 systemd[1]: Successfully made /usr/ read-only.
May 27 17:40:09.772653 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 17:40:09.772660 systemd[1]: Detected virtualization vmware.
May 27 17:40:09.772666 systemd[1]: Detected architecture x86-64.
May 27 17:40:09.772673 systemd[1]: Running in initrd.
May 27 17:40:09.772679 systemd[1]: No hostname configured, using default hostname.
May 27 17:40:09.772686 systemd[1]: Hostname set to .
May 27 17:40:09.772692 systemd[1]: Initializing machine ID from random generator.
May 27 17:40:09.772698 systemd[1]: Queued start job for default target initrd.target.
May 27 17:40:09.772706 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 17:40:09.772713 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 17:40:09.772721 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 17:40:09.772727 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 17:40:09.772733 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 17:40:09.772740 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 17:40:09.772747 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 17:40:09.772755 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 17:40:09.772762 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 17:40:09.772768 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 17:40:09.772775 systemd[1]: Reached target paths.target - Path Units.
May 27 17:40:09.772781 systemd[1]: Reached target slices.target - Slice Units.
May 27 17:40:09.772788 systemd[1]: Reached target swap.target - Swaps.
May 27 17:40:09.772794 systemd[1]: Reached target timers.target - Timer Units.
May 27 17:40:09.772801 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 17:40:09.772808 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 17:40:09.772815 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 17:40:09.772821 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 17:40:09.772828 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 17:40:09.772834 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 17:40:09.772841 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 17:40:09.772848 systemd[1]: Reached target sockets.target - Socket Units.
May 27 17:40:09.772854 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 17:40:09.772861 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 17:40:09.772868 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 17:40:09.772875 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 17:40:09.772882 systemd[1]: Starting systemd-fsck-usr.service...
May 27 17:40:09.772888 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 17:40:09.772895 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 17:40:09.772901 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:40:09.772908 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 17:40:09.772915 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 17:40:09.772934 systemd-journald[243]: Collecting audit messages is disabled.
May 27 17:40:09.772951 systemd[1]: Finished systemd-fsck-usr.service.
May 27 17:40:09.772959 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 17:40:09.772966 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 17:40:09.772972 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 17:40:09.772979 kernel: Bridge firewalling registered
May 27 17:40:09.772985 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 17:40:09.772991 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 17:40:09.772999 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 17:40:09.773006 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:40:09.773012 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 17:40:09.773019 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 17:40:09.773025 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 17:40:09.773033 systemd-journald[243]: Journal started
May 27 17:40:09.773049 systemd-journald[243]: Runtime Journal (/run/log/journal/877e891a489e4691bc552c46b39785d7) is 4.8M, max 38.8M, 34M free.
May 27 17:40:09.722504 systemd-modules-load[245]: Inserted module 'overlay'
May 27 17:40:09.744167 systemd-modules-load[245]: Inserted module 'br_netfilter'
May 27 17:40:09.774291 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 17:40:09.777016 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 17:40:09.781426 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 17:40:09.782347 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 27 17:40:09.784704 systemd-tmpfiles[274]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 17:40:09.786558 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 17:40:09.788348 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 17:40:09.792407 dracut-cmdline[282]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 17:40:09.818174 systemd-resolved[285]: Positive Trust Anchors:
May 27 17:40:09.818184 systemd-resolved[285]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 17:40:09.818207 systemd-resolved[285]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 17:40:09.820613 systemd-resolved[285]: Defaulting to hostname 'linux'.
May 27 17:40:09.821567 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 17:40:09.821701 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 17:40:09.851299 kernel: SCSI subsystem initialized
May 27 17:40:09.868299 kernel: Loading iSCSI transport class v2.0-870.
May 27 17:40:09.876294 kernel: iscsi: registered transport (tcp)
May 27 17:40:09.899484 kernel: iscsi: registered transport (qla4xxx)
May 27 17:40:09.899526 kernel: QLogic iSCSI HBA Driver
May 27 17:40:09.910145 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 17:40:09.919848 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 17:40:09.920964 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 17:40:09.943332 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 27 17:40:09.944271 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 27 17:40:09.981291 kernel: raid6: avx2x4 gen() 47723 MB/s
May 27 17:40:09.998292 kernel: raid6: avx2x2 gen() 53210 MB/s
May 27 17:40:10.015491 kernel: raid6: avx2x1 gen() 44812 MB/s
May 27 17:40:10.015509 kernel: raid6: using algorithm avx2x2 gen() 53210 MB/s
May 27 17:40:10.033474 kernel: raid6: .... xor() 32181 MB/s, rmw enabled
May 27 17:40:10.033489 kernel: raid6: using avx2x2 recovery algorithm
May 27 17:40:10.047288 kernel: xor: automatically using best checksumming function avx
May 27 17:40:10.151302 kernel: Btrfs loaded, zoned=no, fsverity=no
May 27 17:40:10.154169 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 27 17:40:10.155145 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 17:40:10.171205 systemd-udevd[493]: Using default interface naming scheme 'v255'.
May 27 17:40:10.174867 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 17:40:10.175470 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 27 17:40:10.191895 dracut-pre-trigger[495]: rd.md=0: removing MD RAID activation
May 27 17:40:10.204815 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 17:40:10.205689 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 17:40:10.281819 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 17:40:10.283275 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 27 17:40:10.355388 kernel: VMware PVSCSI driver - version 1.0.7.0-k
May 27 17:40:10.360295 kernel: vmw_pvscsi: using 64bit dma
May 27 17:40:10.360316 kernel: vmw_pvscsi: max_id: 16
May 27 17:40:10.360324 kernel: vmw_pvscsi: setting ring_pages to 8
May 27 17:40:10.371298 kernel: vmw_pvscsi: enabling reqCallThreshold
May 27 17:40:10.371318 kernel: vmw_pvscsi: driver-based request coalescing enabled
May 27 17:40:10.371326 kernel: vmw_pvscsi: using MSI-X
May 27 17:40:10.371336 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
May 27 17:40:10.376292 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI
May 27 17:40:10.382048 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
May 27 17:40:10.382196 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
May 27 17:40:10.388906 kernel: cryptd: max_cpu_qlen set to 1000
May 27 17:40:10.388927 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
May 27 17:40:10.392290 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
May 27 17:40:10.394334 (udev-worker)[537]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
May 27 17:40:10.399298 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
May 27 17:40:10.399420 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 17:40:10.399494 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:40:10.404479 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
May 27 17:40:10.404675 kernel: sd 0:0:0:0: [sda] Write Protect is off
May 27 17:40:10.404822 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
May 27 17:40:10.406486 kernel: sd 0:0:0:0: [sda] Cache data unavailable
May 27 17:40:10.406650 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
May 27 17:40:10.406743 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input2
May 27 17:40:10.407889 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:40:10.408484 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:40:10.418954 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 17:40:10.418984 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
May 27 17:40:10.424912 kernel: AES CTR mode by8 optimization enabled
May 27 17:40:10.428288 kernel: libata version 3.00 loaded.
May 27 17:40:10.441023 kernel: ata_piix 0000:00:07.1: version 2.13
May 27 17:40:10.441137 kernel: scsi host1: ata_piix
May 27 17:40:10.442305 kernel: scsi host2: ata_piix
May 27 17:40:10.442381 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0
May 27 17:40:10.442391 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0
May 27 17:40:10.447197 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:40:10.476345 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
May 27 17:40:10.481775 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
May 27 17:40:10.487088 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
May 27 17:40:10.491512 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
May 27 17:40:10.491780 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
May 27 17:40:10.492474 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 27 17:40:10.528300 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 17:40:10.539295 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 17:40:10.612299 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
May 27 17:40:10.619978 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
May 27 17:40:10.645699 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
May 27 17:40:10.645817 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 27 17:40:10.657290 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
May 27 17:40:11.002908 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 27 17:40:11.003361 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 17:40:11.003533 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 17:40:11.003799 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 17:40:11.004634 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 27 17:40:11.021366 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 27 17:40:11.585959 disk-uuid[641]: The operation has completed successfully.
May 27 17:40:11.587013 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 27 17:40:11.775608 systemd[1]: disk-uuid.service: Deactivated successfully.
May 27 17:40:11.775665 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 27 17:40:11.791115 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 27 17:40:11.800052 sh[672]: Success
May 27 17:40:11.813364 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 27 17:40:11.813386 kernel: device-mapper: uevent: version 1.0.3
May 27 17:40:11.814476 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 27 17:40:11.821289 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
May 27 17:40:11.862235 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 27 17:40:11.864315 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 27 17:40:11.873326 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 27 17:40:11.885416 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 27 17:40:11.885446 kernel: BTRFS: device fsid 7caef027-0915-4c01-a3d5-28eff70f7ebd devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (684)
May 27 17:40:11.886902 kernel: BTRFS info (device dm-0): first mount of filesystem 7caef027-0915-4c01-a3d5-28eff70f7ebd
May 27 17:40:11.888294 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 27 17:40:11.888313 kernel: BTRFS info (device dm-0): using free-space-tree
May 27 17:40:11.895567 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 27 17:40:11.895863 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 27 17:40:11.896420 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
May 27 17:40:11.897682 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 27 17:40:11.926300 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (707)
May 27 17:40:11.929735 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:40:11.929759 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 27 17:40:11.929767 kernel: BTRFS info (device sda6): using free-space-tree
May 27 17:40:11.938313 kernel: BTRFS info (device sda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:40:11.938898 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 27 17:40:11.939577 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 27 17:40:11.972465 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
May 27 17:40:11.973412 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 27 17:40:12.036163 ignition[726]: Ignition 2.21.0
May 27 17:40:12.036389 ignition[726]: Stage: fetch-offline
May 27 17:40:12.036410 ignition[726]: no configs at "/usr/lib/ignition/base.d"
May 27 17:40:12.036414 ignition[726]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:12.036462 ignition[726]: parsed url from cmdline: ""
May 27 17:40:12.036463 ignition[726]: no config URL provided
May 27 17:40:12.036467 ignition[726]: reading system config file "/usr/lib/ignition/user.ign"
May 27 17:40:12.036471 ignition[726]: no config at "/usr/lib/ignition/user.ign"
May 27 17:40:12.036820 ignition[726]: config successfully fetched
May 27 17:40:12.036839 ignition[726]: parsing config with SHA512: 477567f241c0b90060b9a08cadbad8b70c2f910499b4949a6e42f694e6146dea09ea35417806f0b37fd9c50f3c537cc88434225551567a2fbe6027b250b90c15
May 27 17:40:12.041118 unknown[726]: fetched base config from "system"
May 27 17:40:12.041124 unknown[726]: fetched user config from "vmware"
May 27 17:40:12.041323 ignition[726]: fetch-offline: fetch-offline passed
May 27 17:40:12.041356 ignition[726]: Ignition finished successfully
May 27 17:40:12.042187 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 17:40:12.063058 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 17:40:12.064124 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 17:40:12.083927 systemd-networkd[864]: lo: Link UP
May 27 17:40:12.083933 systemd-networkd[864]: lo: Gained carrier
May 27 17:40:12.084620 systemd-networkd[864]: Enumeration completed
May 27 17:40:12.084846 systemd-networkd[864]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
May 27 17:40:12.084908 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 17:40:12.085045 systemd[1]: Reached target network.target - Network.
May 27 17:40:12.087715 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
May 27 17:40:12.087818 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
May 27 17:40:12.085135 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
May 27 17:40:12.085692 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 27 17:40:12.087808 systemd-networkd[864]: ens192: Link UP
May 27 17:40:12.087810 systemd-networkd[864]: ens192: Gained carrier
May 27 17:40:12.106426 ignition[867]: Ignition 2.21.0
May 27 17:40:12.106436 ignition[867]: Stage: kargs
May 27 17:40:12.106517 ignition[867]: no configs at "/usr/lib/ignition/base.d"
May 27 17:40:12.106523 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:12.107323 ignition[867]: kargs: kargs passed
May 27 17:40:12.107356 ignition[867]: Ignition finished successfully
May 27 17:40:12.108667 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 27 17:40:12.109544 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 27 17:40:12.127239 ignition[875]: Ignition 2.21.0
May 27 17:40:12.127249 ignition[875]: Stage: disks
May 27 17:40:12.127342 ignition[875]: no configs at "/usr/lib/ignition/base.d"
May 27 17:40:12.127348 ignition[875]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:12.128460 ignition[875]: disks: disks passed
May 27 17:40:12.128496 ignition[875]: Ignition finished successfully
May 27 17:40:12.129420 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 27 17:40:12.129938 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 27 17:40:12.130220 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 27 17:40:12.130498 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 17:40:12.130732 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 17:40:12.130938 systemd[1]: Reached target basic.target - Basic System.
May 27 17:40:12.131635 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 27 17:40:12.156463 systemd-fsck[884]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
May 27 17:40:12.158104 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 27 17:40:12.159182 systemd[1]: Mounting sysroot.mount - /sysroot...
May 27 17:40:12.238189 systemd[1]: Mounted sysroot.mount - /sysroot.
May 27 17:40:12.238362 kernel: EXT4-fs (sda9): mounted filesystem bf93e767-f532-4480-b210-a196f7ac181e r/w with ordered data mode. Quota mode: none.
May 27 17:40:12.238671 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 27 17:40:12.239630 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 27 17:40:12.241310 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 27 17:40:12.241692 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 27 17:40:12.241868 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 27 17:40:12.241882 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 17:40:12.251664 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 27 17:40:12.252366 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 27 17:40:12.258290 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (892)
May 27 17:40:12.261626 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:40:12.261644 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 27 17:40:12.261655 kernel: BTRFS info (device sda6): using free-space-tree
May 27 17:40:12.267082 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 27 17:40:12.283347 initrd-setup-root[916]: cut: /sysroot/etc/passwd: No such file or directory
May 27 17:40:12.285514 initrd-setup-root[923]: cut: /sysroot/etc/group: No such file or directory
May 27 17:40:12.287864 initrd-setup-root[930]: cut: /sysroot/etc/shadow: No such file or directory
May 27 17:40:12.290297 initrd-setup-root[937]: cut: /sysroot/etc/gshadow: No such file or directory
May 27 17:40:12.351878 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 27 17:40:12.352841 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 27 17:40:12.354347 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 27 17:40:12.364286 kernel: BTRFS info (device sda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:40:12.378813 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 27 17:40:12.379514 ignition[1004]: INFO : Ignition 2.21.0
May 27 17:40:12.379514 ignition[1004]: INFO : Stage: mount
May 27 17:40:12.379514 ignition[1004]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 17:40:12.379514 ignition[1004]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:12.380666 ignition[1004]: INFO : mount: mount passed
May 27 17:40:12.380811 ignition[1004]: INFO : Ignition finished successfully
May 27 17:40:12.381573 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 27 17:40:12.382302 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 27 17:40:12.884161 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 27 17:40:12.885367 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 27 17:40:12.900383 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (1017)
May 27 17:40:12.900412 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:40:12.903048 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 27 17:40:12.903068 kernel: BTRFS info (device sda6): using free-space-tree
May 27 17:40:12.907436 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 27 17:40:12.923071 ignition[1033]: INFO : Ignition 2.21.0
May 27 17:40:12.923071 ignition[1033]: INFO : Stage: files
May 27 17:40:12.923446 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 17:40:12.923446 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:12.923772 ignition[1033]: DEBUG : files: compiled without relabeling support, skipping
May 27 17:40:12.924581 ignition[1033]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 27 17:40:12.924581 ignition[1033]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 27 17:40:12.925978 ignition[1033]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 27 17:40:12.926185 ignition[1033]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 27 17:40:12.926337 unknown[1033]: wrote ssh authorized keys file for user: core
May 27 17:40:12.926559 ignition[1033]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 27 17:40:12.927673 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 27 17:40:12.927863 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 27 17:40:13.001063 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 27 17:40:13.343793 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 27 17:40:13.345582 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 27 17:40:13.345747 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 27 17:40:13.345747 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 27 17:40:13.347901 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 27 17:40:13.347901 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 27 17:40:13.348313 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
May 27 17:40:13.981427 systemd-networkd[864]: ens192: Gained IPv6LL
May 27 17:40:14.093901 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 27 17:40:14.316123 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 27 17:40:14.316123 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
May 27 17:40:14.316852 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
May 27 17:40:14.317039 ignition[1033]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
May 27 17:40:14.317244 ignition[1033]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 27 17:40:14.317561 ignition[1033]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 27 17:40:14.317561 ignition[1033]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
May 27 17:40:14.317561 ignition[1033]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
May 27 17:40:14.318032 ignition[1033]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 27 17:40:14.318032 ignition[1033]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 27 17:40:14.318032 ignition[1033]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
May 27 17:40:14.318032 ignition[1033]: INFO : files: op(10): [started] setting
preset to disabled for "coreos-metadata.service" May 27 17:40:14.338003 ignition[1033]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" May 27 17:40:14.339871 ignition[1033]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 27 17:40:14.340055 ignition[1033]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" May 27 17:40:14.340055 ignition[1033]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" May 27 17:40:14.340055 ignition[1033]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" May 27 17:40:14.340055 ignition[1033]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 17:40:14.341171 ignition[1033]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 17:40:14.341171 ignition[1033]: INFO : files: files passed May 27 17:40:14.341171 ignition[1033]: INFO : Ignition finished successfully May 27 17:40:14.341415 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 17:40:14.342246 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 17:40:14.344350 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 17:40:14.347759 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 17:40:14.347975 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
May 27 17:40:14.350473 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 17:40:14.350473 initrd-setup-root-after-ignition[1066]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 27 17:40:14.351348 initrd-setup-root-after-ignition[1070]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 17:40:14.352527 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 17:40:14.352961 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 27 17:40:14.353670 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 27 17:40:14.392762 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 27 17:40:14.392856 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 27 17:40:14.393203 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 27 17:40:14.393560 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 27 17:40:14.393825 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 27 17:40:14.394429 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 27 17:40:14.407380 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 17:40:14.408234 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 27 17:40:14.417991 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 27 17:40:14.418251 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 17:40:14.418569 systemd[1]: Stopped target timers.target - Timer Units.
May 27 17:40:14.418822 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 27 17:40:14.418991 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 17:40:14.419371 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 27 17:40:14.419626 systemd[1]: Stopped target basic.target - Basic System.
May 27 17:40:14.419838 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 27 17:40:14.420115 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 17:40:14.420395 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 27 17:40:14.420647 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 27 17:40:14.420923 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 27 17:40:14.421147 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 17:40:14.421439 systemd[1]: Stopped target sysinit.target - System Initialization.
May 27 17:40:14.421712 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 27 17:40:14.421959 systemd[1]: Stopped target swap.target - Swaps.
May 27 17:40:14.422164 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 27 17:40:14.422344 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 27 17:40:14.422680 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 27 17:40:14.422943 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 17:40:14.423169 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 27 17:40:14.423332 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 17:40:14.423592 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 27 17:40:14.423656 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 27 17:40:14.424063 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 27 17:40:14.424129 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 17:40:14.424541 systemd[1]: Stopped target paths.target - Path Units.
May 27 17:40:14.424745 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 27 17:40:14.424914 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 17:40:14.425201 systemd[1]: Stopped target slices.target - Slice Units.
May 27 17:40:14.425435 systemd[1]: Stopped target sockets.target - Socket Units.
May 27 17:40:14.425660 systemd[1]: iscsid.socket: Deactivated successfully.
May 27 17:40:14.425709 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 27 17:40:14.426043 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 27 17:40:14.426097 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 17:40:14.426463 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 27 17:40:14.426538 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 17:40:14.426778 systemd[1]: ignition-files.service: Deactivated successfully.
May 27 17:40:14.426835 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 27 17:40:14.428362 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 27 17:40:14.428471 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 27 17:40:14.428536 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 17:40:14.429064 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 27 17:40:14.429160 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 27 17:40:14.429221 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 17:40:14.429374 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 27 17:40:14.429429 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 17:40:14.432175 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 27 17:40:14.442314 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 27 17:40:14.448524 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 27 17:40:14.453700 ignition[1091]: INFO : Ignition 2.21.0
May 27 17:40:14.453700 ignition[1091]: INFO : Stage: umount
May 27 17:40:14.453971 ignition[1091]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 17:40:14.453971 ignition[1091]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 27 17:40:14.454464 ignition[1091]: INFO : umount: umount passed
May 27 17:40:14.454560 ignition[1091]: INFO : Ignition finished successfully
May 27 17:40:14.455285 systemd[1]: ignition-mount.service: Deactivated successfully.
May 27 17:40:14.455370 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 27 17:40:14.455626 systemd[1]: Stopped target network.target - Network.
May 27 17:40:14.455727 systemd[1]: ignition-disks.service: Deactivated successfully.
May 27 17:40:14.455753 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 27 17:40:14.455898 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 27 17:40:14.455919 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 27 17:40:14.456069 systemd[1]: ignition-setup.service: Deactivated successfully.
May 27 17:40:14.456091 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 27 17:40:14.456236 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 27 17:40:14.456255 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 27 17:40:14.456463 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 27 17:40:14.456757 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 27 17:40:14.458641 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 27 17:40:14.458807 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 27 17:40:14.460006 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 27 17:40:14.460152 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 27 17:40:14.460176 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 17:40:14.460964 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 27 17:40:14.464061 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 27 17:40:14.464128 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 27 17:40:14.464888 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 27 17:40:14.464998 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 27 17:40:14.465137 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 27 17:40:14.465154 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 27 17:40:14.465894 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 27 17:40:14.465981 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 27 17:40:14.466005 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 17:40:14.466120 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
May 27 17:40:14.466142 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
May 27 17:40:14.466252 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 27 17:40:14.466274 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 27 17:40:14.468512 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 27 17:40:14.468534 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 27 17:40:14.469353 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 17:40:14.470059 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 27 17:40:14.476096 systemd[1]: network-cleanup.service: Deactivated successfully.
May 27 17:40:14.476306 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 27 17:40:14.480683 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 27 17:40:14.480891 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 17:40:14.481265 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 27 17:40:14.481445 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 27 17:40:14.481696 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 27 17:40:14.481823 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 17:40:14.482076 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 27 17:40:14.482203 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 27 17:40:14.482504 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 27 17:40:14.482629 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 27 17:40:14.482918 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 27 17:40:14.482941 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 17:40:14.483811 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 27 17:40:14.484060 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 27 17:40:14.484197 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 27 17:40:14.484548 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 27 17:40:14.484575 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 17:40:14.485009 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 17:40:14.485033 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:40:14.496008 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 27 17:40:14.496068 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 27 17:40:14.501428 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 27 17:40:14.501485 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 27 17:40:14.501747 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 27 17:40:14.501853 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 27 17:40:14.501879 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 27 17:40:14.502455 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 27 17:40:14.521327 systemd[1]: Switching root.
May 27 17:40:14.563327 systemd-journald[243]: Journal stopped
May 27 17:40:16.152871 systemd-journald[243]: Received SIGTERM from PID 1 (systemd).
May 27 17:40:16.152894 kernel: SELinux: policy capability network_peer_controls=1
May 27 17:40:16.152902 kernel: SELinux: policy capability open_perms=1
May 27 17:40:16.152908 kernel: SELinux: policy capability extended_socket_class=1
May 27 17:40:16.152913 kernel: SELinux: policy capability always_check_network=0
May 27 17:40:16.152920 kernel: SELinux: policy capability cgroup_seclabel=1
May 27 17:40:16.152926 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 27 17:40:16.152932 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 27 17:40:16.152938 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 27 17:40:16.152943 kernel: SELinux: policy capability userspace_initial_context=0
May 27 17:40:16.152949 systemd[1]: Successfully loaded SELinux policy in 79.931ms.
May 27 17:40:16.152956 kernel: audit: type=1403 audit(1748367615.385:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 27 17:40:16.152964 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.288ms.
May 27 17:40:16.152971 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 17:40:16.152978 systemd[1]: Detected virtualization vmware.
May 27 17:40:16.152987 systemd[1]: Detected architecture x86-64.
May 27 17:40:16.152994 systemd[1]: Detected first boot.
May 27 17:40:16.153001 systemd[1]: Initializing machine ID from random generator.
May 27 17:40:16.153007 zram_generator::config[1135]: No configuration found.
May 27 17:40:16.153091 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
May 27 17:40:16.153102 kernel: Guest personality initialized and is active
May 27 17:40:16.153108 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 27 17:40:16.153114 kernel: Initialized host personality
May 27 17:40:16.153122 kernel: NET: Registered PF_VSOCK protocol family
May 27 17:40:16.153128 systemd[1]: Populated /etc with preset unit settings.
May 27 17:40:16.153135 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
May 27 17:40:16.153143 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
May 27 17:40:16.153149 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 17:40:16.153155 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 17:40:16.153162 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 17:40:16.153169 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 17:40:16.153176 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 17:40:16.153183 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 17:40:16.153189 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 17:40:16.153197 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 17:40:16.153203 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 17:40:16.153210 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 17:40:16.153218 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 17:40:16.153225 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 17:40:16.153231 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 17:40:16.153240 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 17:40:16.153246 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 17:40:16.153253 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 17:40:16.153260 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 17:40:16.153266 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 17:40:16.153275 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 27 17:40:16.153288 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 17:40:16.153296 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 17:40:16.153302 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 17:40:16.153309 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 17:40:16.153316 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 17:40:16.153323 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 17:40:16.153330 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 17:40:16.153338 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 17:40:16.153345 systemd[1]: Reached target slices.target - Slice Units.
May 27 17:40:16.153352 systemd[1]: Reached target swap.target - Swaps.
May 27 17:40:16.153358 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 17:40:16.153366 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 17:40:16.153373 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 17:40:16.153380 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 17:40:16.153387 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 17:40:16.153394 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 17:40:16.153401 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 17:40:16.153408 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 17:40:16.153415 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 17:40:16.153422 systemd[1]: Mounting media.mount - External Media Directory...
May 27 17:40:16.153430 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:40:16.153437 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 17:40:16.153444 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 17:40:16.153450 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 17:40:16.153458 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 17:40:16.153464 systemd[1]: Reached target machines.target - Containers.
May 27 17:40:16.153471 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 17:40:16.153478 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
May 27 17:40:16.153486 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 17:40:16.153492 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 17:40:16.153500 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 17:40:16.153507 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 17:40:16.153514 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 17:40:16.153520 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 17:40:16.153527 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 17:40:16.153534 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 17:40:16.153542 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 17:40:16.153549 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 17:40:16.153556 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 17:40:16.153563 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 17:40:16.153570 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 17:40:16.153577 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 17:40:16.153584 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 17:40:16.153590 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 17:40:16.153597 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 17:40:16.153605 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 17:40:16.153612 kernel: fuse: init (API version 7.41)
May 27 17:40:16.153618 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 17:40:16.153625 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 17:40:16.153632 systemd[1]: Stopped verity-setup.service.
May 27 17:40:16.153639 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:40:16.153646 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 17:40:16.153653 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 17:40:16.153661 systemd[1]: Mounted media.mount - External Media Directory.
May 27 17:40:16.153668 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 17:40:16.153675 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 17:40:16.153682 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 17:40:16.153688 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 17:40:16.153695 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 17:40:16.153702 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 17:40:16.153709 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 17:40:16.153715 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 17:40:16.153723 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 17:40:16.153730 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 17:40:16.153737 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 17:40:16.153744 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 27 17:40:16.153750 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 27 17:40:16.153757 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 17:40:16.153764 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 17:40:16.153771 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 27 17:40:16.153778 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 17:40:16.153785 kernel: loop: module loaded
May 27 17:40:16.153807 systemd-journald[1232]: Collecting audit messages is disabled.
May 27 17:40:16.153826 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 27 17:40:16.153833 kernel: ACPI: bus type drm_connector registered
May 27 17:40:16.153840 systemd-journald[1232]: Journal started
May 27 17:40:16.153861 systemd-journald[1232]: Runtime Journal (/run/log/journal/cd9c8113f4e04bf0a142d70406bfbdf5) is 4.8M, max 38.8M, 34M free.
May 27 17:40:15.947853 systemd[1]: Queued start job for default target multi-user.target.
May 27 17:40:15.960856 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
May 27 17:40:15.961114 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 17:40:16.158532 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 27 17:40:16.158553 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 27 17:40:16.158564 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 17:40:16.158574 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 27 17:40:16.158638 jq[1205]: true
May 27 17:40:16.159174 jq[1250]: true
May 27 17:40:16.167287 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 27 17:40:16.169291 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 17:40:16.185974 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 17:40:16.186003 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 17:40:16.187286 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 17:40:16.197166 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 17:40:16.201295 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 17:40:16.201326 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 17:40:16.201340 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 17:40:16.203204 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 17:40:16.205301 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 17:40:16.205558 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 17:40:16.209400 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 17:40:16.209682 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 27 17:40:16.210662 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 17:40:16.211384 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 17:40:16.211616 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 17:40:16.220784 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 17:40:16.226890 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 17:40:16.229410 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 17:40:16.233817 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 17:40:16.234243 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 17:40:16.235288 kernel: loop0: detected capacity change from 0 to 146240
May 27 17:40:16.255504 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 17:40:16.268473 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 17:40:16.271433 systemd-journald[1232]: Time spent on flushing to /var/log/journal/cd9c8113f4e04bf0a142d70406bfbdf5 is 23.651ms for 1763 entries.
May 27 17:40:16.271433 systemd-journald[1232]: System Journal (/var/log/journal/cd9c8113f4e04bf0a142d70406bfbdf5) is 8M, max 584.8M, 576.8M free.
May 27 17:40:16.298366 systemd-journald[1232]: Received client request to flush runtime journal.
May 27 17:40:16.274615 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 17:40:16.282863 ignition[1261]: Ignition 2.21.0
May 27 17:40:16.275714 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 17:40:16.283029 ignition[1261]: deleting config from guestinfo properties
May 27 17:40:16.298892 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
May 27 17:40:16.291853 ignition[1261]: Successfully deleted config
May 27 17:40:16.301385 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 17:40:16.314344 kernel: loop1: detected capacity change from 0 to 2960
May 27 17:40:16.318233 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 17:40:16.321947 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
May 27 17:40:16.321957 systemd-tmpfiles[1298]: ACLs are not supported, ignoring. May 27 17:40:16.325036 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:40:16.339298 kernel: loop2: detected capacity change from 0 to 113872 May 27 17:40:16.372295 kernel: loop3: detected capacity change from 0 to 221472 May 27 17:40:16.405073 kernel: loop4: detected capacity change from 0 to 146240 May 27 17:40:16.443442 kernel: loop5: detected capacity change from 0 to 2960 May 27 17:40:16.458294 kernel: loop6: detected capacity change from 0 to 113872 May 27 17:40:16.486320 kernel: loop7: detected capacity change from 0 to 221472 May 27 17:40:16.511520 (sd-merge)[1309]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. May 27 17:40:16.512070 (sd-merge)[1309]: Merged extensions into '/usr'. May 27 17:40:16.519370 systemd[1]: Reload requested from client PID 1258 ('systemd-sysext') (unit systemd-sysext.service)... May 27 17:40:16.519385 systemd[1]: Reloading... May 27 17:40:16.564294 zram_generator::config[1333]: No configuration found. May 27 17:40:16.682069 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:40:16.705324 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 27 17:40:16.764909 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 17:40:16.765155 systemd[1]: Reloading finished in 245 ms. May 27 17:40:16.796246 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 17:40:16.802921 systemd[1]: Starting ensure-sysext.service... 
May 27 17:40:16.806464 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:40:16.815482 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 17:40:16.819445 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:40:16.824407 systemd[1]: Reload requested from client PID 1391 ('systemctl') (unit ensure-sysext.service)... May 27 17:40:16.824417 systemd[1]: Reloading... May 27 17:40:16.826232 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 17:40:16.826252 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 17:40:16.826598 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 17:40:16.828260 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 17:40:16.828837 systemd-tmpfiles[1392]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 17:40:16.829088 systemd-tmpfiles[1392]: ACLs are not supported, ignoring. May 27 17:40:16.829179 systemd-tmpfiles[1392]: ACLs are not supported, ignoring. May 27 17:40:16.831791 systemd-tmpfiles[1392]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:40:16.831842 systemd-tmpfiles[1392]: Skipping /boot May 27 17:40:16.838093 systemd-tmpfiles[1392]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:40:16.838143 systemd-tmpfiles[1392]: Skipping /boot May 27 17:40:16.870289 ldconfig[1254]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 17:40:16.871014 systemd-udevd[1395]: Using default interface naming scheme 'v255'. 
May 27 17:40:16.876287 zram_generator::config[1417]: No configuration found. May 27 17:40:17.015955 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:40:17.024800 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 27 17:40:17.034294 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 May 27 17:40:17.048325 kernel: mousedev: PS/2 mouse device common for all mice May 27 17:40:17.052420 kernel: ACPI: button: Power Button [PWRF] May 27 17:40:17.076322 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 27 17:40:17.076481 systemd[1]: Reloading finished in 251 ms. May 27 17:40:17.081963 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:40:17.082360 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 17:40:17.087738 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:40:17.094412 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:40:17.099431 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 17:40:17.100816 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 17:40:17.108511 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:40:17.115398 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:40:17.117357 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 17:40:17.126873 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
May 27 17:40:17.130180 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:40:17.134958 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:40:17.142194 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:40:17.143062 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:40:17.143371 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:40:17.143434 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:40:17.143492 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:40:17.151488 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:40:17.151605 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:40:17.151692 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:40:17.151777 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:40:17.154606 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 27 17:40:17.156258 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:40:17.156440 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:40:17.156506 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:40:17.156594 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:40:17.158896 systemd[1]: Finished ensure-sysext.service. May 27 17:40:17.159186 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 17:40:17.159959 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 17:40:17.168412 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 27 17:40:17.171390 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 17:40:17.177073 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:40:17.177204 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:40:17.177470 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:40:17.177588 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:40:17.178627 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:40:17.189786 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
May 27 17:40:17.194321 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 17:40:17.200396 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:40:17.200524 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:40:17.200724 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:40:17.200906 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:40:17.201006 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:40:17.209164 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 17:40:17.209776 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 17:40:17.220853 augenrules[1561]: No rules May 27 17:40:17.221263 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:40:17.225288 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! May 27 17:40:17.228375 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:40:17.238819 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. May 27 17:40:17.243584 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 17:40:17.268574 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 17:40:17.299793 systemd-networkd[1518]: lo: Link UP May 27 17:40:17.299799 systemd-networkd[1518]: lo: Gained carrier May 27 17:40:17.300392 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 27 17:40:17.300580 systemd[1]: Reached target time-set.target - System Time Set. 
May 27 17:40:17.302595 systemd-networkd[1518]: Enumeration completed May 27 17:40:17.302606 systemd-timesyncd[1539]: No network connectivity, watching for changes. May 27 17:40:17.302632 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:40:17.305376 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 17:40:17.306140 systemd-networkd[1518]: ens192: Configuring with /etc/systemd/network/00-vmware.network. May 27 17:40:17.308291 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated May 27 17:40:17.308420 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps May 27 17:40:17.306504 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 17:40:17.310550 systemd-networkd[1518]: ens192: Link UP May 27 17:40:17.310639 systemd-networkd[1518]: ens192: Gained carrier May 27 17:40:17.318338 systemd-timesyncd[1539]: Network configuration changed, trying to establish connection. May 27 17:40:17.325892 systemd-resolved[1519]: Positive Trust Anchors: May 27 17:40:17.327923 systemd-resolved[1519]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:40:17.327987 systemd-resolved[1519]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:40:17.331042 systemd-resolved[1519]: Defaulting to hostname 'linux'. 
May 27 17:40:17.334360 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:40:17.334523 systemd[1]: Reached target network.target - Network. May 27 17:40:17.334620 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:40:17.334738 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:40:17.334885 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 17:40:17.335007 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 17:40:17.335117 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 27 17:40:17.335303 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 17:40:17.335450 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 17:40:17.335558 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 17:40:17.335664 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 17:40:17.335682 systemd[1]: Reached target paths.target - Path Units. May 27 17:40:17.335770 systemd[1]: Reached target timers.target - Timer Units. May 27 17:40:17.337382 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 17:40:17.338406 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 17:40:17.341087 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 17:40:17.341290 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 17:40:17.341407 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 17:40:17.348481 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
May 27 17:40:17.348775 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 17:40:17.350379 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 17:40:17.350564 (udev-worker)[1435]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. May 27 17:40:17.350565 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 17:40:17.351253 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:40:17.351362 systemd[1]: Reached target basic.target - Basic System. May 27 17:40:17.351487 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 17:40:17.351503 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 17:40:17.353429 systemd[1]: Starting containerd.service - containerd container runtime... May 27 17:40:17.354737 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 17:40:17.355448 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 17:40:17.356137 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 17:40:17.359396 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 17:40:17.359510 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 17:40:17.360887 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 27 17:40:17.363340 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 17:40:17.370881 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 17:40:17.373416 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
May 27 17:40:17.378271 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing passwd entry cache May 27 17:40:17.378930 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 27 17:40:17.380569 jq[1598]: false May 27 17:40:17.380731 oslogin_cache_refresh[1600]: Refreshing passwd entry cache May 27 17:40:17.381610 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 17:40:17.382185 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 17:40:17.385561 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 17:40:17.389758 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting users, quitting May 27 17:40:17.389758 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 27 17:40:17.389758 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Refreshing group entry cache May 27 17:40:17.389480 oslogin_cache_refresh[1600]: Failure getting users, quitting May 27 17:40:17.389489 oslogin_cache_refresh[1600]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 27 17:40:17.389515 oslogin_cache_refresh[1600]: Refreshing group entry cache May 27 17:40:17.391020 systemd[1]: Starting update-engine.service - Update Engine... May 27 17:40:17.393383 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Failure getting groups, quitting May 27 17:40:17.393383 google_oslogin_nss_cache[1600]: oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 27 17:40:17.393351 oslogin_cache_refresh[1600]: Failure getting groups, quitting May 27 17:40:17.393357 oslogin_cache_refresh[1600]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
May 27 17:40:17.394320 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 17:40:17.397454 extend-filesystems[1599]: Found loop4 May 27 17:40:17.398617 extend-filesystems[1599]: Found loop5 May 27 17:40:17.398617 extend-filesystems[1599]: Found loop6 May 27 17:40:17.398617 extend-filesystems[1599]: Found loop7 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda May 27 17:40:17.398617 extend-filesystems[1599]: Found sda1 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda2 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda3 May 27 17:40:17.398617 extend-filesystems[1599]: Found usr May 27 17:40:17.398617 extend-filesystems[1599]: Found sda4 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda6 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda7 May 27 17:40:17.398617 extend-filesystems[1599]: Found sda9 May 27 17:40:17.398617 extend-filesystems[1599]: Checking size of /dev/sda9 May 27 17:40:17.398748 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... May 27 17:40:17.404646 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 17:40:17.405466 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 17:40:17.405588 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 17:40:17.405739 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 27 17:40:17.405843 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 27 17:40:17.409373 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 17:40:17.410317 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 27 17:40:17.414300 jq[1610]: true May 27 17:40:17.426509 update_engine[1609]: I20250527 17:40:17.424526 1609 main.cc:92] Flatcar Update Engine starting May 27 17:40:17.427920 extend-filesystems[1599]: Old size kept for /dev/sda9 May 27 17:40:17.427920 extend-filesystems[1599]: Found sr0 May 27 17:40:17.428958 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 17:40:17.433392 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 17:40:17.435349 jq[1628]: true May 27 17:40:17.435915 (ntainerd)[1632]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 17:40:17.453637 tar[1620]: linux-amd64/helm May 27 17:40:17.459117 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:40:17.459629 systemd[1]: motdgen.service: Deactivated successfully. May 27 17:40:17.460686 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 27 17:40:17.468027 dbus-daemon[1596]: [system] SELinux support is enabled May 27 17:40:17.468252 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 17:40:17.469751 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 17:40:17.469765 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 17:40:17.471327 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 17:40:17.471341 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 17:40:17.488229 systemd[1]: Started update-engine.service - Update Engine. 
May 27 17:40:17.489740 update_engine[1609]: I20250527 17:40:17.489567 1609 update_check_scheduler.cc:74] Next update check in 10m52s May 27 17:40:17.493680 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 17:41:39.842101 systemd-resolved[1519]: Clock change detected. Flushing caches. May 27 17:41:39.842180 systemd-timesyncd[1539]: Contacted time server 66.118.231.14:123 (0.flatcar.pool.ntp.org). May 27 17:41:39.842209 systemd-timesyncd[1539]: Initial clock synchronization to Tue 2025-05-27 17:41:39.841974 UTC. May 27 17:41:39.867709 bash[1657]: Updated "/home/core/.ssh/authorized_keys" May 27 17:41:39.870648 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 17:41:39.871044 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 27 17:41:39.934876 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. May 27 17:41:39.940049 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... May 27 17:41:39.973986 locksmithd[1652]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 17:41:40.005037 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. 
May 27 17:41:40.008714 unknown[1672]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath May 27 17:41:40.009242 unknown[1672]: Core dump limit set to -1 May 27 17:41:40.017741 containerd[1632]: time="2025-05-27T17:41:40Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 17:41:40.018133 containerd[1632]: time="2025-05-27T17:41:40.018111161Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027420945Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.695µs" May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027445463Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027461942Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027598717Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027613841Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027641159Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027695175Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 
containerd[1632]: time="2025-05-27T17:41:40.027702724Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027835651Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027846554Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027852311Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:41:40.028360 containerd[1632]: time="2025-05-27T17:41:40.027856537Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.027896859Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.028024460Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.028040552Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.028046262Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 17:41:40.028552 containerd[1632]: 
time="2025-05-27T17:41:40.028073164Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.028230794Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 27 17:41:40.028552 containerd[1632]: time="2025-05-27T17:41:40.028265535Z" level=info msg="metadata content store policy set" policy=shared
May 27 17:41:40.048949 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:41:40.063768 systemd-logind[1607]: Watching system buttons on /dev/input/event2 (Power Button)
May 27 17:41:40.063823 systemd-logind[1607]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 27 17:41:40.064145 systemd-logind[1607]: New seat seat0.
May 27 17:41:40.065541 systemd[1]: Started systemd-logind.service - User Login Management.
May 27 17:41:40.074415 containerd[1632]: time="2025-05-27T17:41:40.074388722Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074428712Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074439468Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074446854Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074454595Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074460500Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074467013Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074473633Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 27 17:41:40.074481 containerd[1632]: time="2025-05-27T17:41:40.074480379Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074487881Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074493581Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074500976Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074576946Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074589223Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074598126Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 27 17:41:40.074605 containerd[1632]: time="2025-05-27T17:41:40.074604056Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074610470Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074627504Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074637749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074643666Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074650213Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074659524Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074665715Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074706341Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 27 17:41:40.074718 containerd[1632]: time="2025-05-27T17:41:40.074714771Z" level=info msg="Start snapshots syncer"
May 27 17:41:40.075031 containerd[1632]: time="2025-05-27T17:41:40.074727540Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 27 17:41:40.075031 containerd[1632]: time="2025-05-27T17:41:40.074889353Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 27 17:41:40.075197 containerd[1632]: time="2025-05-27T17:41:40.074915705Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 27 17:41:40.075371 containerd[1632]: time="2025-05-27T17:41:40.075347125Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 27 17:41:40.075435 containerd[1632]: time="2025-05-27T17:41:40.075423479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075439293Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075446051Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075451760Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075458364Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075464348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075471092Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075484305Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075490918Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075497737Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075521579Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075531385Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075536303Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 17:41:40.075542 containerd[1632]: time="2025-05-27T17:41:40.075542225Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075546865Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075552579Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075560036Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075589700Z" level=info msg="runtime interface created"
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075594428Z" level=info msg="created NRI interface"
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075599682Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075606006Z" level=info msg="Connect containerd service"
May 27 17:41:40.075877 containerd[1632]: time="2025-05-27T17:41:40.075619597Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 27 17:41:40.079407 containerd[1632]: time="2025-05-27T17:41:40.079387328Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 27 17:41:40.282482 containerd[1632]: time="2025-05-27T17:41:40.282425485Z" level=info msg="Start subscribing containerd event"
May 27 17:41:40.282482 containerd[1632]: time="2025-05-27T17:41:40.282456725Z" level=info msg="Start recovering state"
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282518120Z" level=info msg="Start event monitor"
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282526456Z" level=info msg="Start cni network conf syncer for default"
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282530670Z" level=info msg="Start streaming server"
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282537897Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282541884Z" level=info msg="runtime interface starting up..."
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282544643Z" level=info msg="starting plugins..."
May 27 17:41:40.282555 containerd[1632]: time="2025-05-27T17:41:40.282552075Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 27 17:41:40.283065 containerd[1632]: time="2025-05-27T17:41:40.282699674Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 27 17:41:40.283065 containerd[1632]: time="2025-05-27T17:41:40.282731930Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 27 17:41:40.283065 containerd[1632]: time="2025-05-27T17:41:40.282805698Z" level=info msg="containerd successfully booted in 0.265272s"
May 27 17:41:40.282869 systemd[1]: Started containerd.service - containerd container runtime.
May 27 17:41:40.368454 tar[1620]: linux-amd64/LICENSE
May 27 17:41:40.368523 tar[1620]: linux-amd64/README.md
May 27 17:41:40.379046 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 27 17:41:40.429888 sshd_keygen[1630]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 27 17:41:40.443153 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 27 17:41:40.444474 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 27 17:41:40.452722 systemd[1]: issuegen.service: Deactivated successfully.
May 27 17:41:40.452874 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 27 17:41:40.454640 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 27 17:41:40.479302 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 27 17:41:40.480466 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 27 17:41:40.482130 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 27 17:41:40.482933 systemd[1]: Reached target getty.target - Login Prompts.
May 27 17:41:41.368125 systemd-networkd[1518]: ens192: Gained IPv6LL
May 27 17:41:41.369771 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 27 17:41:41.370609 systemd[1]: Reached target network-online.target - Network is Online.
May 27 17:41:41.371975 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
May 27 17:41:41.374133 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:41:41.378813 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 27 17:41:41.403951 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 27 17:41:41.410034 systemd[1]: coreos-metadata.service: Deactivated successfully.
May 27 17:41:41.410179 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
May 27 17:41:41.411350 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 27 17:41:42.402233 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 27 17:41:42.403772 systemd[1]: Started sshd@0-139.178.70.105:22-196.251.114.29:51824.service - OpenSSH per-connection server daemon (196.251.114.29:51824).
May 27 17:41:42.578439 sshd[1792]: Connection closed by 196.251.114.29 port 51824
May 27 17:41:42.579252 systemd[1]: sshd@0-139.178.70.105:22-196.251.114.29:51824.service: Deactivated successfully.
May 27 17:41:42.582943 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:41:42.583358 systemd[1]: Reached target multi-user.target - Multi-User System.
May 27 17:41:42.583782 systemd[1]: Startup finished in 2.743s (kernel) + 5.746s (initrd) + 4.946s (userspace) = 13.435s.
May 27 17:41:42.589349 (kubelet)[1800]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 17:41:42.617111 login[1763]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
May 27 17:41:42.618071 login[1764]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
May 27 17:41:42.622658 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 27 17:41:42.623520 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 27 17:41:42.629351 systemd-logind[1607]: New session 2 of user core.
May 27 17:41:42.633338 systemd-logind[1607]: New session 1 of user core.
May 27 17:41:42.641886 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 27 17:41:42.644613 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 27 17:41:42.654704 (systemd)[1807]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 27 17:41:42.657418 systemd-logind[1607]: New session c1 of user core.
May 27 17:41:42.749648 systemd[1807]: Queued start job for default target default.target.
May 27 17:41:42.754031 systemd[1807]: Created slice app.slice - User Application Slice.
May 27 17:41:42.754053 systemd[1807]: Reached target paths.target - Paths.
May 27 17:41:42.754085 systemd[1807]: Reached target timers.target - Timers.
May 27 17:41:42.757050 systemd[1807]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 27 17:41:42.763303 systemd[1807]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 27 17:41:42.763341 systemd[1807]: Reached target sockets.target - Sockets.
May 27 17:41:42.763496 systemd[1807]: Reached target basic.target - Basic System.
May 27 17:41:42.763544 systemd[1]: Started user@500.service - User Manager for UID 500.
May 27 17:41:42.764051 systemd[1807]: Reached target default.target - Main User Target.
May 27 17:41:42.764075 systemd[1807]: Startup finished in 102ms.
May 27 17:41:42.768149 systemd[1]: Started session-1.scope - Session 1 of User core.
May 27 17:41:42.768709 systemd[1]: Started session-2.scope - Session 2 of User core.
May 27 17:41:43.047465 kubelet[1800]: E0527 17:41:43.047390 1800 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 17:41:43.048897 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 17:41:43.048988 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 17:41:43.049204 systemd[1]: kubelet.service: Consumed 637ms CPU time, 263.3M memory peak.
May 27 17:41:53.264682 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 27 17:41:53.265909 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:41:53.700776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:41:53.704236 (kubelet)[1850]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 17:41:53.815932 kubelet[1850]: E0527 17:41:53.815898 1850 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 17:41:53.818670 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 17:41:53.818763 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 17:41:53.818958 systemd[1]: kubelet.service: Consumed 112ms CPU time, 110.1M memory peak.
May 27 17:42:04.014786 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 27 17:42:04.016288 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:42:04.160536 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:42:04.162970 (kubelet)[1865]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 17:42:04.209467 kubelet[1865]: E0527 17:42:04.209431 1865 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 17:42:04.211018 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 17:42:04.211195 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 17:42:04.211571 systemd[1]: kubelet.service: Consumed 87ms CPU time, 108M memory peak.
May 27 17:42:10.123694 systemd[1]: Started sshd@1-139.178.70.105:22-139.178.89.65:58526.service - OpenSSH per-connection server daemon (139.178.89.65:58526).
May 27 17:42:10.167091 sshd[1873]: Accepted publickey for core from 139.178.89.65 port 58526 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.168076 sshd-session[1873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.170847 systemd-logind[1607]: New session 3 of user core.
May 27 17:42:10.182142 systemd[1]: Started session-3.scope - Session 3 of User core.
May 27 17:42:10.239225 systemd[1]: Started sshd@2-139.178.70.105:22-139.178.89.65:58532.service - OpenSSH per-connection server daemon (139.178.89.65:58532).
May 27 17:42:10.275633 sshd[1878]: Accepted publickey for core from 139.178.89.65 port 58532 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.276556 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.280536 systemd-logind[1607]: New session 4 of user core.
May 27 17:42:10.286109 systemd[1]: Started session-4.scope - Session 4 of User core.
May 27 17:42:10.334751 sshd[1880]: Connection closed by 139.178.89.65 port 58532
May 27 17:42:10.334418 sshd-session[1878]: pam_unix(sshd:session): session closed for user core
May 27 17:42:10.344855 systemd[1]: sshd@2-139.178.70.105:22-139.178.89.65:58532.service: Deactivated successfully.
May 27 17:42:10.345965 systemd[1]: session-4.scope: Deactivated successfully.
May 27 17:42:10.346560 systemd-logind[1607]: Session 4 logged out. Waiting for processes to exit.
May 27 17:42:10.349163 systemd[1]: Started sshd@3-139.178.70.105:22-139.178.89.65:58536.service - OpenSSH per-connection server daemon (139.178.89.65:58536).
May 27 17:42:10.349829 systemd-logind[1607]: Removed session 4.
May 27 17:42:10.381281 sshd[1886]: Accepted publickey for core from 139.178.89.65 port 58536 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.381949 sshd-session[1886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.384487 systemd-logind[1607]: New session 5 of user core.
May 27 17:42:10.391080 systemd[1]: Started session-5.scope - Session 5 of User core.
May 27 17:42:10.437408 sshd[1888]: Connection closed by 139.178.89.65 port 58536
May 27 17:42:10.438212 sshd-session[1886]: pam_unix(sshd:session): session closed for user core
May 27 17:42:10.446719 systemd[1]: sshd@3-139.178.70.105:22-139.178.89.65:58536.service: Deactivated successfully.
May 27 17:42:10.447722 systemd[1]: session-5.scope: Deactivated successfully.
May 27 17:42:10.448214 systemd-logind[1607]: Session 5 logged out. Waiting for processes to exit.
May 27 17:42:10.449500 systemd[1]: Started sshd@4-139.178.70.105:22-139.178.89.65:58548.service - OpenSSH per-connection server daemon (139.178.89.65:58548).
May 27 17:42:10.450453 systemd-logind[1607]: Removed session 5.
May 27 17:42:10.488271 sshd[1894]: Accepted publickey for core from 139.178.89.65 port 58548 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.489077 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.492204 systemd-logind[1607]: New session 6 of user core.
May 27 17:42:10.499112 systemd[1]: Started session-6.scope - Session 6 of User core.
May 27 17:42:10.548445 sshd[1896]: Connection closed by 139.178.89.65 port 58548
May 27 17:42:10.548792 sshd-session[1894]: pam_unix(sshd:session): session closed for user core
May 27 17:42:10.555794 systemd[1]: sshd@4-139.178.70.105:22-139.178.89.65:58548.service: Deactivated successfully.
May 27 17:42:10.556668 systemd[1]: session-6.scope: Deactivated successfully.
May 27 17:42:10.557068 systemd-logind[1607]: Session 6 logged out. Waiting for processes to exit.
May 27 17:42:10.558300 systemd[1]: Started sshd@5-139.178.70.105:22-139.178.89.65:58560.service - OpenSSH per-connection server daemon (139.178.89.65:58560).
May 27 17:42:10.561084 systemd-logind[1607]: Removed session 6.
May 27 17:42:10.591608 sshd[1902]: Accepted publickey for core from 139.178.89.65 port 58560 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.592364 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.595540 systemd-logind[1607]: New session 7 of user core.
May 27 17:42:10.601084 systemd[1]: Started session-7.scope - Session 7 of User core.
May 27 17:42:10.659826 sudo[1905]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 27 17:42:10.659991 sudo[1905]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 17:42:10.669598 sudo[1905]: pam_unix(sudo:session): session closed for user root
May 27 17:42:10.670591 sshd[1904]: Connection closed by 139.178.89.65 port 58560
May 27 17:42:10.670982 sshd-session[1902]: pam_unix(sshd:session): session closed for user core
May 27 17:42:10.677317 systemd[1]: sshd@5-139.178.70.105:22-139.178.89.65:58560.service: Deactivated successfully.
May 27 17:42:10.678542 systemd[1]: session-7.scope: Deactivated successfully.
May 27 17:42:10.679131 systemd-logind[1607]: Session 7 logged out. Waiting for processes to exit.
May 27 17:42:10.681258 systemd[1]: Started sshd@6-139.178.70.105:22-139.178.89.65:58576.service - OpenSSH per-connection server daemon (139.178.89.65:58576).
May 27 17:42:10.682056 systemd-logind[1607]: Removed session 7.
May 27 17:42:10.720808 sshd[1911]: Accepted publickey for core from 139.178.89.65 port 58576 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.721595 sshd-session[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.724199 systemd-logind[1607]: New session 8 of user core.
May 27 17:42:10.729223 systemd[1]: Started session-8.scope - Session 8 of User core.
May 27 17:42:10.777353 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 27 17:42:10.777711 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 17:42:10.780503 sudo[1915]: pam_unix(sudo:session): session closed for user root
May 27 17:42:10.783494 sudo[1914]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 27 17:42:10.783649 sudo[1914]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 17:42:10.789495 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 17:42:10.814822 augenrules[1937]: No rules
May 27 17:42:10.815645 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 17:42:10.815785 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 17:42:10.816354 sudo[1914]: pam_unix(sudo:session): session closed for user root
May 27 17:42:10.817931 sshd[1913]: Connection closed by 139.178.89.65 port 58576
May 27 17:42:10.817283 sshd-session[1911]: pam_unix(sshd:session): session closed for user core
May 27 17:42:10.826031 systemd[1]: sshd@6-139.178.70.105:22-139.178.89.65:58576.service: Deactivated successfully.
May 27 17:42:10.826900 systemd[1]: session-8.scope: Deactivated successfully.
May 27 17:42:10.827331 systemd-logind[1607]: Session 8 logged out. Waiting for processes to exit.
May 27 17:42:10.828733 systemd[1]: Started sshd@7-139.178.70.105:22-139.178.89.65:58590.service - OpenSSH per-connection server daemon (139.178.89.65:58590).
May 27 17:42:10.830242 systemd-logind[1607]: Removed session 8.
May 27 17:42:10.863453 sshd[1946]: Accepted publickey for core from 139.178.89.65 port 58590 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:42:10.864180 sshd-session[1946]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:42:10.866712 systemd-logind[1607]: New session 9 of user core.
May 27 17:42:10.876093 systemd[1]: Started session-9.scope - Session 9 of User core.
May 27 17:42:10.923209 sudo[1949]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 27 17:42:10.923719 sudo[1949]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 17:42:11.272892 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 27 17:42:11.282325 (dockerd)[1967]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 27 17:42:11.490791 dockerd[1967]: time="2025-05-27T17:42:11.490751503Z" level=info msg="Starting up"
May 27 17:42:11.492165 dockerd[1967]: time="2025-05-27T17:42:11.492110128Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 27 17:42:11.547681 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport4060480889-merged.mount: Deactivated successfully.
May 27 17:42:11.610763 systemd[1]: var-lib-docker-metacopy\x2dcheck3653721108-merged.mount: Deactivated successfully.
May 27 17:42:11.638796 dockerd[1967]: time="2025-05-27T17:42:11.638625000Z" level=info msg="Loading containers: start."
May 27 17:42:11.711008 kernel: Initializing XFRM netlink socket
May 27 17:42:12.196485 systemd-networkd[1518]: docker0: Link UP
May 27 17:42:12.199576 dockerd[1967]: time="2025-05-27T17:42:12.199550200Z" level=info msg="Loading containers: done."
May 27 17:42:12.208541 dockerd[1967]: time="2025-05-27T17:42:12.208300114Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 27 17:42:12.208541 dockerd[1967]: time="2025-05-27T17:42:12.208360457Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 27 17:42:12.208541 dockerd[1967]: time="2025-05-27T17:42:12.208428505Z" level=info msg="Initializing buildkit"
May 27 17:42:12.218977 dockerd[1967]: time="2025-05-27T17:42:12.218947355Z" level=info msg="Completed buildkit initialization"
May 27 17:42:12.224785 dockerd[1967]: time="2025-05-27T17:42:12.224671292Z" level=info msg="Daemon has completed initialization"
May 27 17:42:12.224856 systemd[1]: Started docker.service - Docker Application Container Engine.
May 27 17:42:12.225032 dockerd[1967]: time="2025-05-27T17:42:12.224935280Z" level=info msg="API listen on /run/docker.sock"
May 27 17:42:12.545058 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck582801084-merged.mount: Deactivated successfully.
May 27 17:42:14.004623 containerd[1632]: time="2025-05-27T17:42:14.004599380Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\""
May 27 17:42:14.264560 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 27 17:42:14.265647 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:42:14.484055 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:42:14.489155 (kubelet)[2177]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 17:42:14.516238 kubelet[2177]: E0527 17:42:14.516168 2177 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 17:42:14.517919 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 17:42:14.518040 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 17:42:14.518255 systemd[1]: kubelet.service: Consumed 93ms CPU time, 107.5M memory peak.
May 27 17:42:15.274198 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3317053359.mount: Deactivated successfully.
May 27 17:42:16.214020 containerd[1632]: time="2025-05-27T17:42:16.213698661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:16.215745 containerd[1632]: time="2025-05-27T17:42:16.215729050Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.9: active requests=0, bytes read=28078845"
May 27 17:42:16.219687 containerd[1632]: time="2025-05-27T17:42:16.219670678Z" level=info msg="ImageCreate event name:\"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:16.224421 containerd[1632]: time="2025-05-27T17:42:16.224400627Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:16.224762 containerd[1632]: time="2025-05-27T17:42:16.224676539Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.9\" with image id \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\", size \"28075645\" in 2.220053093s"
May 27 17:42:16.224762 containerd[1632]: time="2025-05-27T17:42:16.224695287Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\" returns image reference \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\""
May 27 17:42:16.225039 containerd[1632]: time="2025-05-27T17:42:16.225025533Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\""
May 27 17:42:17.640633 containerd[1632]: time="2025-05-27T17:42:17.640606705Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:17.648538 containerd[1632]: time="2025-05-27T17:42:17.648507544Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.9: active requests=0, bytes read=24713522"
May 27 17:42:17.660629 containerd[1632]: time="2025-05-27T17:42:17.660590364Z" level=info msg="ImageCreate event name:\"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:17.669013 containerd[1632]: time="2025-05-27T17:42:17.668534384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:17.669122 containerd[1632]: time="2025-05-27T17:42:17.669108951Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.9\" with image id \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\", size \"26315362\" in 1.444067675s"
May 27 17:42:17.669175 containerd[1632]: time="2025-05-27T17:42:17.669166322Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\" returns image reference \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\""
May 27 17:42:17.669463 containerd[1632]: time="2025-05-27T17:42:17.669445238Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\""
May 27 17:42:19.176589 containerd[1632]: time="2025-05-27T17:42:19.176552764Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:19.181413 containerd[1632]: time="2025-05-27T17:42:19.181391841Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.9: active requests=0, bytes read=18784311"
May 27 17:42:19.187418 containerd[1632]: time="2025-05-27T17:42:19.187384518Z" level=info msg="ImageCreate event name:\"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:19.196645 containerd[1632]: time="2025-05-27T17:42:19.196598497Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:19.197308 containerd[1632]: time="2025-05-27T17:42:19.197191101Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.9\" with image id \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\", size \"20386169\" in 1.527728253s"
May 27 17:42:19.197308 containerd[1632]: time="2025-05-27T17:42:19.197214834Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\" returns image reference \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\""
May 27 17:42:19.197770 containerd[1632]: time="2025-05-27T17:42:19.197553863Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\""
May 27 17:42:20.138043 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3451087983.mount: Deactivated successfully.
May 27 17:42:20.524296 containerd[1632]: time="2025-05-27T17:42:20.524262079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:20.528965 containerd[1632]: time="2025-05-27T17:42:20.528932182Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.9: active requests=0, bytes read=30355623"
May 27 17:42:20.535224 containerd[1632]: time="2025-05-27T17:42:20.535186533Z" level=info msg="ImageCreate event name:\"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:20.544827 containerd[1632]: time="2025-05-27T17:42:20.544771985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:20.546184 containerd[1632]: time="2025-05-27T17:42:20.545681202Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.9\" with image id \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\", repo tag \"registry.k8s.io/kube-proxy:v1.31.9\", repo digest
\"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\", size \"30354642\" in 1.348107327s" May 27 17:42:20.546184 containerd[1632]: time="2025-05-27T17:42:20.545704918Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\" returns image reference \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\"" May 27 17:42:20.547662 containerd[1632]: time="2025-05-27T17:42:20.547648048Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 17:42:21.819267 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount162063152.mount: Deactivated successfully. May 27 17:42:22.726017 containerd[1632]: time="2025-05-27T17:42:22.725960532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:22.734046 containerd[1632]: time="2025-05-27T17:42:22.734020448Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" May 27 17:42:22.742645 containerd[1632]: time="2025-05-27T17:42:22.742600154Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:22.748145 containerd[1632]: time="2025-05-27T17:42:22.748114421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:22.750013 containerd[1632]: time="2025-05-27T17:42:22.749381907Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.201663942s" May 27 17:42:22.750013 containerd[1632]: time="2025-05-27T17:42:22.749438744Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 27 17:42:22.750013 containerd[1632]: time="2025-05-27T17:42:22.749919133Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 17:42:23.453397 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3495214554.mount: Deactivated successfully. May 27 17:42:23.461141 containerd[1632]: time="2025-05-27T17:42:23.460736441Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:42:23.461378 containerd[1632]: time="2025-05-27T17:42:23.461366261Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 27 17:42:23.461963 containerd[1632]: time="2025-05-27T17:42:23.461950326Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:42:23.463232 containerd[1632]: time="2025-05-27T17:42:23.463213602Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:42:23.463682 containerd[1632]: time="2025-05-27T17:42:23.463519354Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 713.583114ms" May 27 17:42:23.463733 containerd[1632]: time="2025-05-27T17:42:23.463725387Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 27 17:42:23.464263 containerd[1632]: time="2025-05-27T17:42:23.464252843Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 27 17:42:24.395642 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1763419103.mount: Deactivated successfully. May 27 17:42:24.764621 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 27 17:42:24.765727 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:42:25.152843 update_engine[1609]: I20250527 17:42:25.152393 1609 update_attempter.cc:509] Updating boot flags... May 27 17:42:25.578879 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:42:25.585501 (kubelet)[2345]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:42:25.773923 kubelet[2345]: E0527 17:42:25.773816 2345 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:42:25.775330 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:42:25.775528 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:42:25.775877 systemd[1]: kubelet.service: Consumed 112ms CPU time, 107M memory peak. 
May 27 17:42:30.506377 containerd[1632]: time="2025-05-27T17:42:30.506325533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:30.507702 containerd[1632]: time="2025-05-27T17:42:30.507396373Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013"
May 27 17:42:30.508107 containerd[1632]: time="2025-05-27T17:42:30.508089162Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:30.510678 containerd[1632]: time="2025-05-27T17:42:30.510660716Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 17:42:30.511458 containerd[1632]: time="2025-05-27T17:42:30.511412774Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 7.047102504s"
May 27 17:42:30.511458 containerd[1632]: time="2025-05-27T17:42:30.511432132Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
May 27 17:42:32.536782 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:42:32.537225 systemd[1]: kubelet.service: Consumed 112ms CPU time, 107M memory peak.
May 27 17:42:32.538947 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:42:32.557772 systemd[1]: Reload requested from client PID 2426 ('systemctl') (unit session-9.scope)...
May 27 17:42:32.557785 systemd[1]: Reloading...
May 27 17:42:32.624048 zram_generator::config[2468]: No configuration found.
May 27 17:42:32.687492 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 17:42:32.696439 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
May 27 17:42:32.764807 systemd[1]: Reloading finished in 206 ms.
May 27 17:42:32.794524 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 27 17:42:32.794585 systemd[1]: kubelet.service: Failed with result 'signal'.
May 27 17:42:32.794764 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:42:32.796120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 17:42:33.185739 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 17:42:33.188726 (kubelet)[2536]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 27 17:42:33.228005 kubelet[2536]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 17:42:33.228005 kubelet[2536]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 27 17:42:33.228005 kubelet[2536]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 17:42:33.228005 kubelet[2536]: I0527 17:42:33.227778 2536 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 27 17:42:33.621481 kubelet[2536]: I0527 17:42:33.621415 2536 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
May 27 17:42:33.621481 kubelet[2536]: I0527 17:42:33.621476 2536 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 27 17:42:33.621732 kubelet[2536]: I0527 17:42:33.621722 2536 server.go:934] "Client rotation is on, will bootstrap in background"
May 27 17:42:33.654253 kubelet[2536]: I0527 17:42:33.654218 2536 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 17:42:33.655041 kubelet[2536]: E0527 17:42:33.655024 2536 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.105:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError"
May 27 17:42:33.664121 kubelet[2536]: I0527 17:42:33.664103 2536 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 27 17:42:33.668929 kubelet[2536]: I0527 17:42:33.668912 2536 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 27 17:42:33.671524 kubelet[2536]: I0527 17:42:33.671505 2536 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 27 17:42:33.671633 kubelet[2536]: I0527 17:42:33.671608 2536 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 17:42:33.671749 kubelet[2536]: I0527 17:42:33.671632 2536 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 27 17:42:33.673335 kubelet[2536]: I0527 17:42:33.673320 2536 topology_manager.go:138] "Creating topology manager with none policy"
May 27 17:42:33.673335 kubelet[2536]: I0527 17:42:33.673335 2536 container_manager_linux.go:300] "Creating device plugin manager"
May 27 17:42:33.673882 kubelet[2536]: I0527 17:42:33.673862 2536 state_mem.go:36] "Initialized new in-memory state store"
May 27 17:42:33.676074 kubelet[2536]: I0527 17:42:33.676042 2536 kubelet.go:408] "Attempting to sync node with API server"
May 27 17:42:33.676074 kubelet[2536]: I0527 17:42:33.676073 2536 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 27 17:42:33.678610 kubelet[2536]: I0527 17:42:33.678594 2536 kubelet.go:314] "Adding apiserver pod source"
May 27 17:42:33.678652 kubelet[2536]: I0527 17:42:33.678615 2536 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 27 17:42:33.682966 kubelet[2536]: W0527 17:42:33.682776 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.105:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused
May 27 17:42:33.682966 kubelet[2536]: E0527 17:42:33.682814 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.105:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError"
May 27 17:42:33.682966 kubelet[2536]: I0527 17:42:33.682867 2536 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 27 17:42:33.685184 kubelet[2536]: I0527 17:42:33.685174 2536 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 27 17:42:33.685824 kubelet[2536]: W0527 17:42:33.685815 2536 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 27 17:42:33.686760 kubelet[2536]: W0527 17:42:33.686671 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused
May 27 17:42:33.686760 kubelet[2536]: E0527 17:42:33.686704 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError"
May 27 17:42:33.688254 kubelet[2536]: I0527 17:42:33.688134 2536 server.go:1274] "Started kubelet"
May 27 17:42:33.689412 kubelet[2536]: I0527 17:42:33.689070 2536 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 27 17:42:33.690491 kubelet[2536]: I0527 17:42:33.690346 2536 server.go:449] "Adding debug handlers to kubelet server"
May 27 17:42:33.691603 kubelet[2536]: I0527 17:42:33.691305 2536 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 27 17:42:33.691603 kubelet[2536]: I0527 17:42:33.691466 2536 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 17:42:33.696548 kubelet[2536]: E0527 17:42:33.691552 2536 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.105:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.105:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184373356f53f965 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-27 17:42:33.688119653 +0000 UTC m=+0.497245700,LastTimestamp:2025-05-27 17:42:33.688119653 +0000 UTC m=+0.497245700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
May 27 17:42:33.696789 kubelet[2536]: I0527 17:42:33.696778 2536 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 27 17:42:33.697204 kubelet[2536]: I0527 17:42:33.696920 2536 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 27 17:42:33.700422 kubelet[2536]: E0527 17:42:33.699789 2536 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
May 27 17:42:33.700504 kubelet[2536]: I0527 17:42:33.700497 2536 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 27 17:42:33.704191 kubelet[2536]: I0527 17:42:33.703708 2536 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
May 27 17:42:33.704191 kubelet[2536]: I0527 17:42:33.703757 2536 reconciler.go:26] "Reconciler: start to sync state"
May 27 17:42:33.704472 kubelet[2536]: W0527 17:42:33.704449 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused
May 27 17:42:33.704531 kubelet[2536]: E0527 17:42:33.704519 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError"
May 27 17:42:33.704604 kubelet[2536]: E0527 17:42:33.704591 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="200ms"
May 27 17:42:33.705779 kubelet[2536]: I0527 17:42:33.705731 2536 factory.go:221] Registration of the systemd container factory successfully
May 27 17:42:33.705989 kubelet[2536]: I0527 17:42:33.705977 2536 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 27 17:42:33.710042 kubelet[2536]: I0527 17:42:33.709624 2536 factory.go:221] Registration of the containerd container factory successfully
May 27 17:42:33.712863 kubelet[2536]: I0527 17:42:33.712834 2536 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 27 17:42:33.713584 kubelet[2536]: I0527 17:42:33.713569 2536 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 27 17:42:33.713616 kubelet[2536]: I0527 17:42:33.713588 2536 status_manager.go:217] "Starting to sync pod status with apiserver"
May 27 17:42:33.713616 kubelet[2536]: I0527 17:42:33.713599 2536 kubelet.go:2321] "Starting kubelet main sync loop"
May 27 17:42:33.713652 kubelet[2536]: E0527 17:42:33.713622 2536 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 27 17:42:33.718110 kubelet[2536]: W0527 17:42:33.718076 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused
May 27 17:42:33.718172 kubelet[2536]: E0527 17:42:33.718115 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError"
May 27 17:42:33.733725 kubelet[2536]: E0527 17:42:33.733660 2536 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 27 17:42:33.739164 kubelet[2536]: I0527 17:42:33.739144 2536 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 27 17:42:33.739239 kubelet[2536]: I0527 17:42:33.739174 2536 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 27 17:42:33.739239 kubelet[2536]: I0527 17:42:33.739185 2536 state_mem.go:36] "Initialized new in-memory state store"
May 27 17:42:33.740091 kubelet[2536]: I0527 17:42:33.740080 2536 policy_none.go:49] "None policy: Start"
May 27 17:42:33.740474 kubelet[2536]: I0527 17:42:33.740452 2536 memory_manager.go:170] "Starting memorymanager" policy="None"
May 27 17:42:33.740474 kubelet[2536]: I0527 17:42:33.740464 2536 state_mem.go:35] "Initializing new in-memory state store"
May 27 17:42:33.747265 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 27 17:42:33.755890 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 27 17:42:33.758530 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 27 17:42:33.765926 kubelet[2536]: I0527 17:42:33.765627 2536 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 27 17:42:33.765926 kubelet[2536]: I0527 17:42:33.765750 2536 eviction_manager.go:189] "Eviction manager: starting control loop"
May 27 17:42:33.765926 kubelet[2536]: I0527 17:42:33.765756 2536 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 27 17:42:33.765926 kubelet[2536]: I0527 17:42:33.765904 2536 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 27 17:42:33.767433 kubelet[2536]: E0527 17:42:33.767421 2536 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
May 27 17:42:33.822554 systemd[1]: Created slice kubepods-burstable-pod776be1490f009f56eb85e8be626957f5.slice - libcontainer container kubepods-burstable-pod776be1490f009f56eb85e8be626957f5.slice.
May 27 17:42:33.838842 systemd[1]: Created slice kubepods-burstable-poda3416600bab1918b24583836301c9096.slice - libcontainer container kubepods-burstable-poda3416600bab1918b24583836301c9096.slice.
May 27 17:42:33.841671 systemd[1]: Created slice kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice - libcontainer container kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice.
May 27 17:42:33.866788 kubelet[2536]: I0527 17:42:33.866760 2536 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
May 27 17:42:33.867040 kubelet[2536]: E0527 17:42:33.867025 2536 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost"
May 27 17:42:33.906499 kubelet[2536]: I0527 17:42:33.905547 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost"
May 27 17:42:33.906499 kubelet[2536]: I0527 17:42:33.905577 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost"
May 27 17:42:33.906499 kubelet[2536]: I0527 17:42:33.905591 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost"
May 27 17:42:33.906499 kubelet[2536]: I0527 17:42:33.905603 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost"
May 27 17:42:33.906499 kubelet[2536]: I0527 17:42:33.905612 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost"
May 27 17:42:33.906636 kubelet[2536]: I0527 17:42:33.905621 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost"
May 27 17:42:33.906636 kubelet[2536]: I0527 17:42:33.905629 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost"
May 27 17:42:33.906636 kubelet[2536]: I0527 17:42:33.905638 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost"
May 27 17:42:33.906636 kubelet[2536]: I0527 17:42:33.905648 2536 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost"
May 27 17:42:33.906636 kubelet[2536]: E0527 17:42:33.905709 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="400ms"
May 27 17:42:34.068303 kubelet[2536]: I0527 17:42:34.068279 2536 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
May 27 17:42:34.068515 kubelet[2536]: E0527 17:42:34.068498 2536 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost"
May 27 17:42:34.138203 containerd[1632]: time="2025-05-27T17:42:34.138154783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:776be1490f009f56eb85e8be626957f5,Namespace:kube-system,Attempt:0,}"
May 27 17:42:34.145812 containerd[1632]: time="2025-05-27T17:42:34.145781717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,}"
May 27 17:42:34.146073 containerd[1632]: time="2025-05-27T17:42:34.146051053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,}"
May 27 17:42:34.306851 kubelet[2536]: E0527 17:42:34.306811 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="800ms"
May 27 17:42:34.336602 containerd[1632]: time="2025-05-27T17:42:34.336030200Z" level=info msg="connecting to shim 79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9" address="unix:///run/containerd/s/732ab7f11ae8b0310639285dae7e0cc00ebb5784ef62333a0ba80466acfb3103" namespace=k8s.io protocol=ttrpc version=3
May 27 17:42:34.337085 containerd[1632]: time="2025-05-27T17:42:34.337068983Z" level=info msg="connecting to shim 35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b" address="unix:///run/containerd/s/18e5e4e788f9e1d3c3e51ebf05198cd225f5b4578aa0c865b931b2039ec7bd36" namespace=k8s.io protocol=ttrpc version=3
May 27 17:42:34.337633 containerd[1632]: time="2025-05-27T17:42:34.337618466Z" level=info msg="connecting to shim 34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2" address="unix:///run/containerd/s/3c2c5cef496a8f23ae89368d75fab086b01c307078da8782c683e286a635582e" namespace=k8s.io protocol=ttrpc version=3
May 27 17:42:34.463173 systemd[1]: Started cri-containerd-34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2.scope - libcontainer container 34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2.
May 27 17:42:34.464665 systemd[1]: Started cri-containerd-35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b.scope - libcontainer container 35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b.
May 27 17:42:34.465695 systemd[1]: Started cri-containerd-79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9.scope - libcontainer container 79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9.
May 27 17:42:34.473717 kubelet[2536]: I0527 17:42:34.473135 2536 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 27 17:42:34.473717 kubelet[2536]: E0527 17:42:34.473537 2536 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.105:6443/api/v1/nodes\": dial tcp 139.178.70.105:6443: connect: connection refused" node="localhost" May 27 17:42:34.517089 containerd[1632]: time="2025-05-27T17:42:34.516965384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:776be1490f009f56eb85e8be626957f5,Namespace:kube-system,Attempt:0,} returns sandbox id \"35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b\"" May 27 17:42:34.524464 containerd[1632]: time="2025-05-27T17:42:34.524438678Z" level=info msg="CreateContainer within sandbox \"35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 17:42:34.535507 containerd[1632]: time="2025-05-27T17:42:34.535477658Z" level=info msg="Container a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:34.546807 containerd[1632]: time="2025-05-27T17:42:34.546780627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,} returns sandbox id \"34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2\"" May 27 17:42:34.548119 containerd[1632]: time="2025-05-27T17:42:34.548094344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9\"" May 27 17:42:34.549414 containerd[1632]: time="2025-05-27T17:42:34.549384578Z" level=info msg="CreateContainer within sandbox 
\"79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 17:42:34.550056 containerd[1632]: time="2025-05-27T17:42:34.549464452Z" level=info msg="CreateContainer within sandbox \"34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 17:42:34.551312 containerd[1632]: time="2025-05-27T17:42:34.551202479Z" level=info msg="CreateContainer within sandbox \"35fa92323cbea9eaa4ae8652217aecadf9e477c533f0e4951b52994d93207a8b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00\"" May 27 17:42:34.552889 containerd[1632]: time="2025-05-27T17:42:34.552874188Z" level=info msg="StartContainer for \"a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00\"" May 27 17:42:34.556619 containerd[1632]: time="2025-05-27T17:42:34.556567581Z" level=info msg="connecting to shim a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00" address="unix:///run/containerd/s/18e5e4e788f9e1d3c3e51ebf05198cd225f5b4578aa0c865b931b2039ec7bd36" protocol=ttrpc version=3 May 27 17:42:34.557930 containerd[1632]: time="2025-05-27T17:42:34.557881576Z" level=info msg="Container 3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:34.558312 containerd[1632]: time="2025-05-27T17:42:34.557888654Z" level=info msg="Container 04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:34.564275 containerd[1632]: time="2025-05-27T17:42:34.564246604Z" level=info msg="CreateContainer within sandbox \"34f49384befb37aaa174f3c0cb80e1af76e94cfd42e18cf06c757d6db3405ed2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id 
\"3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9\"" May 27 17:42:34.564762 containerd[1632]: time="2025-05-27T17:42:34.564691215Z" level=info msg="CreateContainer within sandbox \"79a86c6cfc8fcab42aaf41dcb8bbac40cacb89ed6566f76c05a5b7ca5b08efd9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274\"" May 27 17:42:34.565399 containerd[1632]: time="2025-05-27T17:42:34.565376499Z" level=info msg="StartContainer for \"04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274\"" May 27 17:42:34.565613 containerd[1632]: time="2025-05-27T17:42:34.565549414Z" level=info msg="StartContainer for \"3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9\"" May 27 17:42:34.566485 containerd[1632]: time="2025-05-27T17:42:34.566471855Z" level=info msg="connecting to shim 3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9" address="unix:///run/containerd/s/3c2c5cef496a8f23ae89368d75fab086b01c307078da8782c683e286a635582e" protocol=ttrpc version=3 May 27 17:42:34.567108 containerd[1632]: time="2025-05-27T17:42:34.567091401Z" level=info msg="connecting to shim 04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274" address="unix:///run/containerd/s/732ab7f11ae8b0310639285dae7e0cc00ebb5784ef62333a0ba80466acfb3103" protocol=ttrpc version=3 May 27 17:42:34.579146 systemd[1]: Started cri-containerd-a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00.scope - libcontainer container a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00. May 27 17:42:34.587130 systemd[1]: Started cri-containerd-04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274.scope - libcontainer container 04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274. 
May 27 17:42:34.594144 systemd[1]: Started cri-containerd-3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9.scope - libcontainer container 3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9. May 27 17:42:34.641159 containerd[1632]: time="2025-05-27T17:42:34.641134905Z" level=info msg="StartContainer for \"a46bc1b6240dff45f862376b10de4db708d3659e06669ee48a2239c2a5155d00\" returns successfully" May 27 17:42:34.647205 containerd[1632]: time="2025-05-27T17:42:34.647164861Z" level=info msg="StartContainer for \"04a271f07471b3f7eede96bf6957b5c760c0d077e59f934a79f2dc650f009274\" returns successfully" May 27 17:42:34.658094 containerd[1632]: time="2025-05-27T17:42:34.658030745Z" level=info msg="StartContainer for \"3828f85b97e1e8315f2bad2e359e7706e164514aa0e5caed4b7daabc29e894d9\" returns successfully" May 27 17:42:34.661684 kubelet[2536]: W0527 17:42:34.661635 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused May 27 17:42:34.661813 kubelet[2536]: E0527 17:42:34.661745 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" May 27 17:42:34.798426 kubelet[2536]: W0527 17:42:34.798386 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused May 27 17:42:34.798426 kubelet[2536]: E0527 17:42:34.798430 2536 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" May 27 17:42:34.967805 kubelet[2536]: W0527 17:42:34.966631 2536 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.105:6443: connect: connection refused May 27 17:42:34.967805 kubelet[2536]: E0527 17:42:34.966673 2536 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.105:6443: connect: connection refused" logger="UnhandledError" May 27 17:42:35.107908 kubelet[2536]: E0527 17:42:35.107879 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.105:6443: connect: connection refused" interval="1.6s" May 27 17:42:35.275360 kubelet[2536]: I0527 17:42:35.275274 2536 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 27 17:42:35.997545 kubelet[2536]: I0527 17:42:35.997363 2536 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 27 17:42:36.687016 kubelet[2536]: I0527 17:42:36.686899 2536 apiserver.go:52] "Watching apiserver" May 27 17:42:36.704884 kubelet[2536]: I0527 17:42:36.704857 2536 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 27 17:42:36.748164 kubelet[2536]: E0527 17:42:36.748130 2536 kubelet.go:1915] "Failed creating a mirror pod for" 
err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" May 27 17:42:37.912679 systemd[1]: Reload requested from client PID 2807 ('systemctl') (unit session-9.scope)... May 27 17:42:37.912876 systemd[1]: Reloading... May 27 17:42:37.958037 zram_generator::config[2853]: No configuration found. May 27 17:42:38.057516 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:42:38.068357 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 27 17:42:38.153854 systemd[1]: Reloading finished in 240 ms. May 27 17:42:38.173972 kubelet[2536]: I0527 17:42:38.173305 2536 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:42:38.173395 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:42:38.185315 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:42:38.185478 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:42:38.185515 systemd[1]: kubelet.service: Consumed 621ms CPU time, 127.1M memory peak. May 27 17:42:38.187290 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:42:38.822268 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:42:38.828307 (kubelet)[2918]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:42:38.891989 kubelet[2918]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:42:38.891989 kubelet[2918]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 27 17:42:38.891989 kubelet[2918]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:42:38.894040 kubelet[2918]: I0527 17:42:38.893342 2918 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:42:38.904963 kubelet[2918]: I0527 17:42:38.904930 2918 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 27 17:42:38.905696 kubelet[2918]: I0527 17:42:38.905108 2918 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:42:38.905696 kubelet[2918]: I0527 17:42:38.905505 2918 server.go:934] "Client rotation is on, will bootstrap in background" May 27 17:42:38.908472 kubelet[2918]: I0527 17:42:38.908444 2918 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 17:42:38.913791 kubelet[2918]: I0527 17:42:38.913771 2918 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:42:38.920458 kubelet[2918]: I0527 17:42:38.920439 2918 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:42:38.928123 kubelet[2918]: I0527 17:42:38.928099 2918 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:42:38.930677 kubelet[2918]: I0527 17:42:38.930361 2918 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 27 17:42:38.930677 kubelet[2918]: I0527 17:42:38.930568 2918 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:42:38.930819 kubelet[2918]: I0527 17:42:38.930595 2918 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyO
ptions":null,"CgroupVersion":2} May 27 17:42:38.930819 kubelet[2918]: I0527 17:42:38.930742 2918 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:42:38.930819 kubelet[2918]: I0527 17:42:38.930752 2918 container_manager_linux.go:300] "Creating device plugin manager" May 27 17:42:38.930819 kubelet[2918]: I0527 17:42:38.930775 2918 state_mem.go:36] "Initialized new in-memory state store" May 27 17:42:38.931143 kubelet[2918]: I0527 17:42:38.930848 2918 kubelet.go:408] "Attempting to sync node with API server" May 27 17:42:38.931143 kubelet[2918]: I0527 17:42:38.930858 2918 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:42:38.931143 kubelet[2918]: I0527 17:42:38.930887 2918 kubelet.go:314] "Adding apiserver pod source" May 27 17:42:38.931143 kubelet[2918]: I0527 17:42:38.930899 2918 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:42:38.934175 kubelet[2918]: I0527 17:42:38.934151 2918 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:42:38.935439 kubelet[2918]: I0527 17:42:38.935420 2918 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 17:42:38.936178 kubelet[2918]: I0527 17:42:38.936165 2918 server.go:1274] "Started kubelet" May 27 17:42:38.950011 kubelet[2918]: I0527 17:42:38.947438 2918 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:42:38.950946 kubelet[2918]: I0527 17:42:38.950852 2918 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:42:38.957462 kubelet[2918]: I0527 17:42:38.956550 2918 server.go:449] "Adding debug handlers to kubelet server" May 27 17:42:38.959091 kubelet[2918]: I0527 17:42:38.958128 2918 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:42:38.960007 kubelet[2918]: I0527 17:42:38.958332 2918 server.go:236] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:42:38.965267 kubelet[2918]: I0527 17:42:38.965252 2918 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:42:38.965758 kubelet[2918]: I0527 17:42:38.965739 2918 volume_manager.go:289] "Starting Kubelet Volume Manager" May 27 17:42:38.970617 kubelet[2918]: I0527 17:42:38.970595 2918 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 27 17:42:38.970722 kubelet[2918]: I0527 17:42:38.970709 2918 reconciler.go:26] "Reconciler: start to sync state" May 27 17:42:38.974116 kubelet[2918]: E0527 17:42:38.974095 2918 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:42:38.974565 kubelet[2918]: I0527 17:42:38.974553 2918 factory.go:221] Registration of the containerd container factory successfully May 27 17:42:38.974622 kubelet[2918]: I0527 17:42:38.974617 2918 factory.go:221] Registration of the systemd container factory successfully May 27 17:42:38.974733 kubelet[2918]: I0527 17:42:38.974719 2918 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:42:38.978701 kubelet[2918]: I0527 17:42:38.978646 2918 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 17:42:38.980373 kubelet[2918]: I0527 17:42:38.980323 2918 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 27 17:42:38.980373 kubelet[2918]: I0527 17:42:38.980359 2918 status_manager.go:217] "Starting to sync pod status with apiserver" May 27 17:42:38.980373 kubelet[2918]: I0527 17:42:38.980390 2918 kubelet.go:2321] "Starting kubelet main sync loop" May 27 17:42:38.980601 kubelet[2918]: E0527 17:42:38.980432 2918 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:42:39.024651 kubelet[2918]: I0527 17:42:39.024611 2918 cpu_manager.go:214] "Starting CPU manager" policy="none" May 27 17:42:39.024651 kubelet[2918]: I0527 17:42:39.024623 2918 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 27 17:42:39.024651 kubelet[2918]: I0527 17:42:39.024637 2918 state_mem.go:36] "Initialized new in-memory state store" May 27 17:42:39.024855 kubelet[2918]: I0527 17:42:39.024778 2918 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 17:42:39.024855 kubelet[2918]: I0527 17:42:39.024785 2918 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 17:42:39.024855 kubelet[2918]: I0527 17:42:39.024798 2918 policy_none.go:49] "None policy: Start" May 27 17:42:39.025323 kubelet[2918]: I0527 17:42:39.025315 2918 memory_manager.go:170] "Starting memorymanager" policy="None" May 27 17:42:39.025391 kubelet[2918]: I0527 17:42:39.025385 2918 state_mem.go:35] "Initializing new in-memory state store" May 27 17:42:39.026016 kubelet[2918]: I0527 17:42:39.025937 2918 state_mem.go:75] "Updated machine memory state" May 27 17:42:39.034967 kubelet[2918]: I0527 17:42:39.034951 2918 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 17:42:39.035422 kubelet[2918]: I0527 17:42:39.035369 2918 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:42:39.035422 kubelet[2918]: I0527 17:42:39.035379 2918 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:42:39.035681 kubelet[2918]: I0527 17:42:39.035637 2918 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:42:39.141862 kubelet[2918]: I0527 17:42:39.141734 2918 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 27 17:42:39.151528 kubelet[2918]: I0527 17:42:39.151491 2918 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 27 17:42:39.151678 kubelet[2918]: I0527 17:42:39.151597 2918 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 27 17:42:39.280359 kubelet[2918]: I0527 17:42:39.280330 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 27 17:42:39.280359 kubelet[2918]: I0527 17:42:39.280355 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 27 17:42:39.280521 kubelet[2918]: I0527 17:42:39.280369 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 27 17:42:39.280521 kubelet[2918]: I0527 17:42:39.280382 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 27 17:42:39.280521 kubelet[2918]: I0527 17:42:39.280392 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 27 17:42:39.280521 kubelet[2918]: I0527 17:42:39.280404 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost" May 27 17:42:39.280521 kubelet[2918]: I0527 17:42:39.280412 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost" May 27 17:42:39.280632 kubelet[2918]: I0527 17:42:39.280421 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost" May 27 17:42:39.280632 kubelet[2918]: I0527 17:42:39.280431 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" 
(UniqueName: \"kubernetes.io/host-path/776be1490f009f56eb85e8be626957f5-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"776be1490f009f56eb85e8be626957f5\") " pod="kube-system/kube-apiserver-localhost" May 27 17:42:39.941006 kubelet[2918]: I0527 17:42:39.940904 2918 apiserver.go:52] "Watching apiserver" May 27 17:42:39.970781 kubelet[2918]: I0527 17:42:39.970745 2918 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 27 17:42:40.014977 kubelet[2918]: E0527 17:42:40.014779 2918 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" May 27 17:42:40.054422 kubelet[2918]: I0527 17:42:40.054345 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.054332057 podStartE2EDuration="1.054332057s" podCreationTimestamp="2025-05-27 17:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:42:40.048510019 +0000 UTC m=+1.210001177" watchObservedRunningTime="2025-05-27 17:42:40.054332057 +0000 UTC m=+1.215823196" May 27 17:42:40.060503 kubelet[2918]: I0527 17:42:40.060380 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.060367423 podStartE2EDuration="1.060367423s" podCreationTimestamp="2025-05-27 17:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:42:40.054486112 +0000 UTC m=+1.215977249" watchObservedRunningTime="2025-05-27 17:42:40.060367423 +0000 UTC m=+1.221858569" May 27 17:42:40.070475 kubelet[2918]: I0527 17:42:40.070437 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.070424343 podStartE2EDuration="1.070424343s" podCreationTimestamp="2025-05-27 17:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:42:40.061056343 +0000 UTC m=+1.222547489" watchObservedRunningTime="2025-05-27 17:42:40.070424343 +0000 UTC m=+1.231915490" May 27 17:42:43.166424 kubelet[2918]: I0527 17:42:43.166397 2918 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 17:42:43.166851 containerd[1632]: time="2025-05-27T17:42:43.166782010Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 17:42:43.167183 kubelet[2918]: I0527 17:42:43.166887 2918 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 17:42:44.367504 systemd[1]: Created slice kubepods-besteffort-pod7bf84a01_646c_4854_aee5_3e671618d1a0.slice - libcontainer container kubepods-besteffort-pod7bf84a01_646c_4854_aee5_3e671618d1a0.slice. 
May 27 17:42:44.414697 kubelet[2918]: I0527 17:42:44.414666 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7bf84a01-646c-4854-aee5-3e671618d1a0-kube-proxy\") pod \"kube-proxy-9b7pf\" (UID: \"7bf84a01-646c-4854-aee5-3e671618d1a0\") " pod="kube-system/kube-proxy-9b7pf" May 27 17:42:44.414697 kubelet[2918]: I0527 17:42:44.414693 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7bf84a01-646c-4854-aee5-3e671618d1a0-xtables-lock\") pod \"kube-proxy-9b7pf\" (UID: \"7bf84a01-646c-4854-aee5-3e671618d1a0\") " pod="kube-system/kube-proxy-9b7pf" May 27 17:42:44.414697 kubelet[2918]: I0527 17:42:44.414706 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bf84a01-646c-4854-aee5-3e671618d1a0-lib-modules\") pod \"kube-proxy-9b7pf\" (UID: \"7bf84a01-646c-4854-aee5-3e671618d1a0\") " pod="kube-system/kube-proxy-9b7pf" May 27 17:42:44.415062 kubelet[2918]: I0527 17:42:44.414718 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwt9\" (UniqueName: \"kubernetes.io/projected/7bf84a01-646c-4854-aee5-3e671618d1a0-kube-api-access-gkwt9\") pod \"kube-proxy-9b7pf\" (UID: \"7bf84a01-646c-4854-aee5-3e671618d1a0\") " pod="kube-system/kube-proxy-9b7pf" May 27 17:42:44.418751 kubelet[2918]: W0527 17:42:44.418722 2918 reflector.go:561] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object May 27 17:42:44.418846 kubelet[2918]: E0527 17:42:44.418755 2918 
reflector.go:158] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" logger="UnhandledError" May 27 17:42:44.423903 systemd[1]: Created slice kubepods-besteffort-podae2e1877_49d5_4ad3_82c7_637789129927.slice - libcontainer container kubepods-besteffort-podae2e1877_49d5_4ad3_82c7_637789129927.slice. May 27 17:42:44.515809 kubelet[2918]: I0527 17:42:44.515592 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ae2e1877-49d5-4ad3-82c7-637789129927-var-lib-calico\") pod \"tigera-operator-7c5755cdcb-kjc7q\" (UID: \"ae2e1877-49d5-4ad3-82c7-637789129927\") " pod="tigera-operator/tigera-operator-7c5755cdcb-kjc7q" May 27 17:42:44.515809 kubelet[2918]: I0527 17:42:44.515644 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgpk\" (UniqueName: \"kubernetes.io/projected/ae2e1877-49d5-4ad3-82c7-637789129927-kube-api-access-qqgpk\") pod \"tigera-operator-7c5755cdcb-kjc7q\" (UID: \"ae2e1877-49d5-4ad3-82c7-637789129927\") " pod="tigera-operator/tigera-operator-7c5755cdcb-kjc7q" May 27 17:42:44.678165 containerd[1632]: time="2025-05-27T17:42:44.677988704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9b7pf,Uid:7bf84a01-646c-4854-aee5-3e671618d1a0,Namespace:kube-system,Attempt:0,}" May 27 17:42:44.690115 containerd[1632]: time="2025-05-27T17:42:44.690088017Z" level=info msg="connecting to shim 22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d" 
address="unix:///run/containerd/s/0ef5ddb3b6e9c2672d7a616c98aaddfaccfe82637edb3a15185735cfdd23e46c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:42:44.714196 systemd[1]: Started cri-containerd-22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d.scope - libcontainer container 22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d. May 27 17:42:44.726703 containerd[1632]: time="2025-05-27T17:42:44.726647792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-kjc7q,Uid:ae2e1877-49d5-4ad3-82c7-637789129927,Namespace:tigera-operator,Attempt:0,}" May 27 17:42:44.741066 containerd[1632]: time="2025-05-27T17:42:44.741045167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9b7pf,Uid:7bf84a01-646c-4854-aee5-3e671618d1a0,Namespace:kube-system,Attempt:0,} returns sandbox id \"22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d\"" May 27 17:42:44.743706 containerd[1632]: time="2025-05-27T17:42:44.743645400Z" level=info msg="CreateContainer within sandbox \"22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 17:42:44.763832 containerd[1632]: time="2025-05-27T17:42:44.763806989Z" level=info msg="Container c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:44.769636 containerd[1632]: time="2025-05-27T17:42:44.769560192Z" level=info msg="CreateContainer within sandbox \"22dce99c11d76c518f127e275d38085f7201043cb792f4a84e528aac1098ce5d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92\"" May 27 17:42:44.770944 containerd[1632]: time="2025-05-27T17:42:44.770092911Z" level=info msg="connecting to shim f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c" 
address="unix:///run/containerd/s/2dc24e3e1b3485331972d412eac76150c1c0afdd7f4ba275e8ffddf9e48ad73f" namespace=k8s.io protocol=ttrpc version=3 May 27 17:42:44.770944 containerd[1632]: time="2025-05-27T17:42:44.770325185Z" level=info msg="StartContainer for \"c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92\"" May 27 17:42:44.772550 containerd[1632]: time="2025-05-27T17:42:44.772518097Z" level=info msg="connecting to shim c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92" address="unix:///run/containerd/s/0ef5ddb3b6e9c2672d7a616c98aaddfaccfe82637edb3a15185735cfdd23e46c" protocol=ttrpc version=3 May 27 17:42:44.790149 systemd[1]: Started cri-containerd-f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c.scope - libcontainer container f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c. May 27 17:42:44.793055 systemd[1]: Started cri-containerd-c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92.scope - libcontainer container c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92. May 27 17:42:44.832087 containerd[1632]: time="2025-05-27T17:42:44.832035516Z" level=info msg="StartContainer for \"c39820f953014ae79ecab284fce9de240177f168bf140cd3eb18b904b7065e92\" returns successfully" May 27 17:42:44.838259 containerd[1632]: time="2025-05-27T17:42:44.838196033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-kjc7q,Uid:ae2e1877-49d5-4ad3-82c7-637789129927,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c\"" May 27 17:42:44.839956 containerd[1632]: time="2025-05-27T17:42:44.839939897Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 17:42:45.536952 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3442495102.mount: Deactivated successfully. 
May 27 17:42:45.905570 kubelet[2918]: I0527 17:42:45.905431 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9b7pf" podStartSLOduration=1.90541166 podStartE2EDuration="1.90541166s" podCreationTimestamp="2025-05-27 17:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:42:45.026236068 +0000 UTC m=+6.187727216" watchObservedRunningTime="2025-05-27 17:42:45.90541166 +0000 UTC m=+7.066902806" May 27 17:42:46.393007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1488207478.mount: Deactivated successfully. May 27 17:42:46.951526 containerd[1632]: time="2025-05-27T17:42:46.951125970Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:46.955704 containerd[1632]: time="2025-05-27T17:42:46.955669390Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 27 17:42:46.962914 containerd[1632]: time="2025-05-27T17:42:46.962854874Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:46.968464 containerd[1632]: time="2025-05-27T17:42:46.968402153Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:46.968857 containerd[1632]: time="2025-05-27T17:42:46.968741064Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest 
\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.128616257s" May 27 17:42:46.968857 containerd[1632]: time="2025-05-27T17:42:46.968763498Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 27 17:42:46.971136 containerd[1632]: time="2025-05-27T17:42:46.971110905Z" level=info msg="CreateContainer within sandbox \"f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 17:42:46.978455 containerd[1632]: time="2025-05-27T17:42:46.978425564Z" level=info msg="Container ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:46.980583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount518090589.mount: Deactivated successfully. May 27 17:42:46.992144 containerd[1632]: time="2025-05-27T17:42:46.992110395Z" level=info msg="CreateContainer within sandbox \"f08e1522419a0ba6ef8d8fffcded3a81dba8718a24292d636544c1fd5fb50f9c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a\"" May 27 17:42:46.992795 containerd[1632]: time="2025-05-27T17:42:46.992647311Z" level=info msg="StartContainer for \"ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a\"" May 27 17:42:46.994277 containerd[1632]: time="2025-05-27T17:42:46.994230159Z" level=info msg="connecting to shim ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a" address="unix:///run/containerd/s/2dc24e3e1b3485331972d412eac76150c1c0afdd7f4ba275e8ffddf9e48ad73f" protocol=ttrpc version=3 May 27 17:42:47.012149 systemd[1]: Started cri-containerd-ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a.scope - libcontainer container 
ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a. May 27 17:42:47.083691 containerd[1632]: time="2025-05-27T17:42:47.083661483Z" level=info msg="StartContainer for \"ec84ecee7d19b80e87f5afb720dcc87a9b61cff6b3a29c2717cd5da283adc73a\" returns successfully" May 27 17:42:49.853687 kubelet[2918]: I0527 17:42:49.853549 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7c5755cdcb-kjc7q" podStartSLOduration=3.71223464 podStartE2EDuration="5.842723947s" podCreationTimestamp="2025-05-27 17:42:44 +0000 UTC" firstStartedPulling="2025-05-27 17:42:44.838956548 +0000 UTC m=+6.000447684" lastFinishedPulling="2025-05-27 17:42:46.969445852 +0000 UTC m=+8.130936991" observedRunningTime="2025-05-27 17:42:48.057423313 +0000 UTC m=+9.218914452" watchObservedRunningTime="2025-05-27 17:42:49.842723947 +0000 UTC m=+11.004215086" May 27 17:42:53.295725 sudo[1949]: pam_unix(sudo:session): session closed for user root May 27 17:42:53.297356 sshd[1948]: Connection closed by 139.178.89.65 port 58590 May 27 17:42:53.299129 sshd-session[1946]: pam_unix(sshd:session): session closed for user core May 27 17:42:53.301744 systemd[1]: sshd@7-139.178.70.105:22-139.178.89.65:58590.service: Deactivated successfully. May 27 17:42:53.304305 systemd[1]: session-9.scope: Deactivated successfully. May 27 17:42:53.304524 systemd[1]: session-9.scope: Consumed 3.026s CPU time, 151M memory peak. May 27 17:42:53.307327 systemd-logind[1607]: Session 9 logged out. Waiting for processes to exit. May 27 17:42:53.309343 systemd-logind[1607]: Removed session 9. May 27 17:42:55.767620 systemd[1]: Created slice kubepods-besteffort-pod318fb5bf_351c_42bf_9a3e_07beb25cf430.slice - libcontainer container kubepods-besteffort-pod318fb5bf_351c_42bf_9a3e_07beb25cf430.slice. 
May 27 17:42:55.854451 kubelet[2918]: I0527 17:42:55.854402 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpt6j\" (UniqueName: \"kubernetes.io/projected/318fb5bf-351c-42bf-9a3e-07beb25cf430-kube-api-access-bpt6j\") pod \"calico-typha-9d48648b4-92fgx\" (UID: \"318fb5bf-351c-42bf-9a3e-07beb25cf430\") " pod="calico-system/calico-typha-9d48648b4-92fgx" May 27 17:42:55.854451 kubelet[2918]: I0527 17:42:55.854431 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/318fb5bf-351c-42bf-9a3e-07beb25cf430-tigera-ca-bundle\") pod \"calico-typha-9d48648b4-92fgx\" (UID: \"318fb5bf-351c-42bf-9a3e-07beb25cf430\") " pod="calico-system/calico-typha-9d48648b4-92fgx" May 27 17:42:55.854766 kubelet[2918]: I0527 17:42:55.854738 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/318fb5bf-351c-42bf-9a3e-07beb25cf430-typha-certs\") pod \"calico-typha-9d48648b4-92fgx\" (UID: \"318fb5bf-351c-42bf-9a3e-07beb25cf430\") " pod="calico-system/calico-typha-9d48648b4-92fgx" May 27 17:42:55.969943 systemd[1]: Created slice kubepods-besteffort-pod2a38b159_fbd6_4a56_ac7c_bd2f4520f955.slice - libcontainer container kubepods-besteffort-pod2a38b159_fbd6_4a56_ac7c_bd2f4520f955.slice. 
May 27 17:42:56.055886 kubelet[2918]: I0527 17:42:56.055625 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdzw\" (UniqueName: \"kubernetes.io/projected/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-kube-api-access-bjdzw\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056212 kubelet[2918]: I0527 17:42:56.056197 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-flexvol-driver-host\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056313 kubelet[2918]: I0527 17:42:56.056296 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-var-lib-calico\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056387 kubelet[2918]: I0527 17:42:56.056378 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-xtables-lock\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056472 kubelet[2918]: I0527 17:42:56.056459 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-node-certs\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056713 kubelet[2918]: I0527 
17:42:56.056540 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-var-run-calico\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056713 kubelet[2918]: I0527 17:42:56.056589 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-cni-bin-dir\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056713 kubelet[2918]: I0527 17:42:56.056605 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-policysync\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056713 kubelet[2918]: I0527 17:42:56.056619 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-cni-log-dir\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056713 kubelet[2918]: I0527 17:42:56.056632 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-cni-net-dir\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056858 kubelet[2918]: I0527 17:42:56.056649 2918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-tigera-ca-bundle\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.056858 kubelet[2918]: I0527 17:42:56.056673 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a38b159-fbd6-4a56-ac7c-bd2f4520f955-lib-modules\") pod \"calico-node-7g2n8\" (UID: \"2a38b159-fbd6-4a56-ac7c-bd2f4520f955\") " pod="calico-system/calico-node-7g2n8" May 27 17:42:56.099655 containerd[1632]: time="2025-05-27T17:42:56.099601044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9d48648b4-92fgx,Uid:318fb5bf-351c-42bf-9a3e-07beb25cf430,Namespace:calico-system,Attempt:0,}" May 27 17:42:56.210838 kubelet[2918]: E0527 17:42:56.210735 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:42:56.236652 kubelet[2918]: E0527 17:42:56.236633 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.236652 kubelet[2918]: W0527 17:42:56.236648 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.243714 kubelet[2918]: E0527 17:42:56.243692 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.262397 containerd[1632]: time="2025-05-27T17:42:56.262119708Z" level=info msg="connecting to shim da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620" address="unix:///run/containerd/s/017a4afcaea941da2b497b3c612e904dcb0cc6f0afc41eb3f3c6bee4845686dc" namespace=k8s.io protocol=ttrpc version=3 May 27 17:42:56.275169 containerd[1632]: time="2025-05-27T17:42:56.274145479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7g2n8,Uid:2a38b159-fbd6-4a56-ac7c-bd2f4520f955,Namespace:calico-system,Attempt:0,}" May 27 17:42:56.281151 systemd[1]: Started cri-containerd-da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620.scope - libcontainer container da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620. May 27 17:42:56.292639 kubelet[2918]: E0527 17:42:56.292609 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.292639 kubelet[2918]: W0527 17:42:56.292634 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.292825 kubelet[2918]: E0527 17:42:56.292650 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.292825 kubelet[2918]: E0527 17:42:56.292801 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.292825 kubelet[2918]: W0527 17:42:56.292807 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.292893 kubelet[2918]: E0527 17:42:56.292826 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.292932 kubelet[2918]: E0527 17:42:56.292910 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.292932 kubelet[2918]: W0527 17:42:56.292916 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.292989 kubelet[2918]: E0527 17:42:56.292922 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.293105 kubelet[2918]: E0527 17:42:56.293081 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.293105 kubelet[2918]: W0527 17:42:56.293086 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.293105 kubelet[2918]: E0527 17:42:56.293091 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.294100 kubelet[2918]: E0527 17:42:56.294090 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.294186 kubelet[2918]: W0527 17:42:56.294145 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.294186 kubelet[2918]: E0527 17:42:56.294155 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.294338 kubelet[2918]: E0527 17:42:56.294332 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.294411 kubelet[2918]: W0527 17:42:56.294381 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.294411 kubelet[2918]: E0527 17:42:56.294391 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.294586 kubelet[2918]: E0527 17:42:56.294553 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.294586 kubelet[2918]: W0527 17:42:56.294559 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.294586 kubelet[2918]: E0527 17:42:56.294564 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.294773 kubelet[2918]: E0527 17:42:56.294754 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.294850 kubelet[2918]: W0527 17:42:56.294760 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.294850 kubelet[2918]: E0527 17:42:56.294820 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.295055 kubelet[2918]: E0527 17:42:56.295025 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.295055 kubelet[2918]: W0527 17:42:56.295031 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.295055 kubelet[2918]: E0527 17:42:56.295037 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.295227 kubelet[2918]: E0527 17:42:56.295221 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.295298 kubelet[2918]: W0527 17:42:56.295251 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.295298 kubelet[2918]: E0527 17:42:56.295259 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.295480 kubelet[2918]: E0527 17:42:56.295439 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.295480 kubelet[2918]: W0527 17:42:56.295445 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.295480 kubelet[2918]: E0527 17:42:56.295450 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.295660 kubelet[2918]: E0527 17:42:56.295626 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.295660 kubelet[2918]: W0527 17:42:56.295632 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.295660 kubelet[2918]: E0527 17:42:56.295637 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.295856 kubelet[2918]: E0527 17:42:56.295814 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.295856 kubelet[2918]: W0527 17:42:56.295820 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.295856 kubelet[2918]: E0527 17:42:56.295830 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:56.296043 kubelet[2918]: E0527 17:42:56.296014 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.296043 kubelet[2918]: W0527 17:42:56.296020 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.296043 kubelet[2918]: E0527 17:42:56.296026 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:56.296203 kubelet[2918]: E0527 17:42:56.296196 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:56.296272 kubelet[2918]: W0527 17:42:56.296224 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:56.296272 kubelet[2918]: E0527 17:42:56.296232 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 17:42:56.296435 kubelet[2918]: E0527 17:42:56.296399 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 17:42:56.296435 kubelet[2918]: W0527 17:42:56.296405 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 17:42:56.296435 kubelet[2918]: E0527 17:42:56.296410 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 17:42:56.349082 containerd[1632]: time="2025-05-27T17:42:56.348291491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9d48648b4-92fgx,Uid:318fb5bf-351c-42bf-9a3e-07beb25cf430,Namespace:calico-system,Attempt:0,} returns sandbox id \"da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620\""
May 27 17:42:56.350393 containerd[1632]: time="2025-05-27T17:42:56.350377686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\""
May 27 17:42:56.371583 kubelet[2918]: I0527 17:42:56.371450 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2e20f778-9dbd-4e58-98d4-243c667b05e3-varrun\") pod \"csi-node-driver-dhvg9\" (UID: \"2e20f778-9dbd-4e58-98d4-243c667b05e3\") " pod="calico-system/csi-node-driver-dhvg9"
May 27 17:42:56.371848 kubelet[2918]: I0527 17:42:56.371824 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e20f778-9dbd-4e58-98d4-243c667b05e3-registration-dir\") pod \"csi-node-driver-dhvg9\" (UID: \"2e20f778-9dbd-4e58-98d4-243c667b05e3\") " pod="calico-system/csi-node-driver-dhvg9"
May 27 17:42:56.372494 kubelet[2918]: I0527 17:42:56.372394 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qrmw\" (UniqueName: \"kubernetes.io/projected/2e20f778-9dbd-4e58-98d4-243c667b05e3-kube-api-access-8qrmw\") pod \"csi-node-driver-dhvg9\" (UID: \"2e20f778-9dbd-4e58-98d4-243c667b05e3\") " pod="calico-system/csi-node-driver-dhvg9"
May 27 17:42:56.372733 kubelet[2918]: I0527 17:42:56.372722 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e20f778-9dbd-4e58-98d4-243c667b05e3-socket-dir\") pod \"csi-node-driver-dhvg9\" (UID: \"2e20f778-9dbd-4e58-98d4-243c667b05e3\") " pod="calico-system/csi-node-driver-dhvg9"
May 27 17:42:56.373269 kubelet[2918]: I0527 17:42:56.373213 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e20f778-9dbd-4e58-98d4-243c667b05e3-kubelet-dir\") pod \"csi-node-driver-dhvg9\" (UID: \"2e20f778-9dbd-4e58-98d4-243c667b05e3\") " pod="calico-system/csi-node-driver-dhvg9"
May 27 17:42:56.430047 containerd[1632]: time="2025-05-27T17:42:56.429863610Z" level=info msg="connecting to shim bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47" address="unix:///run/containerd/s/95cd56afd76389d2d2c003dac5ec2d8c773c95176151d4af35f62fcc216637cb" namespace=k8s.io protocol=ttrpc version=3
May 27 17:42:56.451173 systemd[1]: Started cri-containerd-bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47.scope - libcontainer container bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47.
May 27 17:42:56.542876 containerd[1632]: time="2025-05-27T17:42:56.542719212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7g2n8,Uid:2a38b159-fbd6-4a56-ac7c-bd2f4520f955,Namespace:calico-system,Attempt:0,} returns sandbox id \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\""
May 27 17:42:57.981096 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4272026412.mount: Deactivated successfully.
May 27 17:42:57.982194 kubelet[2918]: E0527 17:42:57.981523 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:42:58.825392 containerd[1632]: time="2025-05-27T17:42:58.825081876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:58.834787 containerd[1632]: time="2025-05-27T17:42:58.834746297Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 27 17:42:58.844588 containerd[1632]: time="2025-05-27T17:42:58.844543807Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:58.861842 containerd[1632]: time="2025-05-27T17:42:58.861787051Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:42:58.862220 containerd[1632]: time="2025-05-27T17:42:58.862130417Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.511734437s" May 27 17:42:58.862220 containerd[1632]: time="2025-05-27T17:42:58.862151931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference 
\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 27 17:42:58.863319 containerd[1632]: time="2025-05-27T17:42:58.863127872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 17:42:58.874268 containerd[1632]: time="2025-05-27T17:42:58.874221668Z" level=info msg="CreateContainer within sandbox \"da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 17:42:58.909736 containerd[1632]: time="2025-05-27T17:42:58.909682950Z" level=info msg="Container ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e: CDI devices from CRI Config.CDIDevices: []" May 27 17:42:58.911527 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount231125243.mount: Deactivated successfully. May 27 17:42:58.919056 containerd[1632]: time="2025-05-27T17:42:58.919021774Z" level=info msg="CreateContainer within sandbox \"da515c7f4977208bc618b3dfeca01602555ba5f30b5d8350716cd76b55b24620\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e\"" May 27 17:42:58.919570 containerd[1632]: time="2025-05-27T17:42:58.919498926Z" level=info msg="StartContainer for \"ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e\"" May 27 17:42:58.921065 containerd[1632]: time="2025-05-27T17:42:58.921040724Z" level=info msg="connecting to shim ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e" address="unix:///run/containerd/s/017a4afcaea941da2b497b3c612e904dcb0cc6f0afc41eb3f3c6bee4845686dc" protocol=ttrpc version=3 May 27 17:42:58.940160 systemd[1]: Started cri-containerd-ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e.scope - libcontainer container ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e. 
May 27 17:42:59.009272 containerd[1632]: time="2025-05-27T17:42:59.009122563Z" level=info msg="StartContainer for \"ce4ff42e923c249f95a82a9fb11d30900e9beee0d72ab6df9cb363f23649b30e\" returns successfully" May 27 17:42:59.065366 kubelet[2918]: I0527 17:42:59.065333 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9d48648b4-92fgx" podStartSLOduration=1.552151289 podStartE2EDuration="4.065317282s" podCreationTimestamp="2025-05-27 17:42:55 +0000 UTC" firstStartedPulling="2025-05-27 17:42:56.349860152 +0000 UTC m=+17.511351294" lastFinishedPulling="2025-05-27 17:42:58.863026147 +0000 UTC m=+20.024517287" observedRunningTime="2025-05-27 17:42:59.065099506 +0000 UTC m=+20.226590651" watchObservedRunningTime="2025-05-27 17:42:59.065317282 +0000 UTC m=+20.226808430" May 27 17:42:59.117074 kubelet[2918]: E0527 17:42:59.116974 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.117074 kubelet[2918]: W0527 17:42:59.117014 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.117074 kubelet[2918]: E0527 17:42:59.117033 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117527 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118809 kubelet[2918]: W0527 17:42:59.117535 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117544 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117636 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118809 kubelet[2918]: W0527 17:42:59.117641 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117647 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117726 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118809 kubelet[2918]: W0527 17:42:59.117731 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117737 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.118809 kubelet[2918]: E0527 17:42:59.117814 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118982 kubelet[2918]: W0527 17:42:59.117818 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.117822 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.117885 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118982 kubelet[2918]: W0527 17:42:59.117890 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.117895 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.117961 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118982 kubelet[2918]: W0527 17:42:59.117965 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.117970 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.118982 kubelet[2918]: E0527 17:42:59.118045 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.118982 kubelet[2918]: W0527 17:42:59.118049 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118061 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118152 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120506 kubelet[2918]: W0527 17:42:59.118157 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118163 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118246 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120506 kubelet[2918]: W0527 17:42:59.118251 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118257 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118330 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120506 kubelet[2918]: W0527 17:42:59.118335 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120506 kubelet[2918]: E0527 17:42:59.118340 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118406 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120710 kubelet[2918]: W0527 17:42:59.118412 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118416 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118484 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120710 kubelet[2918]: W0527 17:42:59.118489 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118493 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118572 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120710 kubelet[2918]: W0527 17:42:59.118576 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118582 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.120710 kubelet[2918]: E0527 17:42:59.118650 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.120926 kubelet[2918]: W0527 17:42:59.118655 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.120926 kubelet[2918]: E0527 17:42:59.118659 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.200829 kubelet[2918]: E0527 17:42:59.200808 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.200956 kubelet[2918]: W0527 17:42:59.200864 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.200956 kubelet[2918]: E0527 17:42:59.200880 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.201087 kubelet[2918]: E0527 17:42:59.201066 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.201143 kubelet[2918]: W0527 17:42:59.201071 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.201143 kubelet[2918]: E0527 17:42:59.201137 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.201803 kubelet[2918]: E0527 17:42:59.201792 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.201803 kubelet[2918]: W0527 17:42:59.201801 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.201803 kubelet[2918]: E0527 17:42:59.201810 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.201954 kubelet[2918]: E0527 17:42:59.201934 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.201954 kubelet[2918]: W0527 17:42:59.201942 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.201954 kubelet[2918]: E0527 17:42:59.201950 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.202097 kubelet[2918]: E0527 17:42:59.202048 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202097 kubelet[2918]: W0527 17:42:59.202053 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202142 kubelet[2918]: E0527 17:42:59.202128 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202142 kubelet[2918]: W0527 17:42:59.202132 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202228 kubelet[2918]: E0527 17:42:59.202067 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.202228 kubelet[2918]: E0527 17:42:59.202175 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.202228 kubelet[2918]: E0527 17:42:59.202206 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202228 kubelet[2918]: W0527 17:42:59.202210 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202228 kubelet[2918]: E0527 17:42:59.202217 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.202400 kubelet[2918]: E0527 17:42:59.202304 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202400 kubelet[2918]: W0527 17:42:59.202309 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202400 kubelet[2918]: E0527 17:42:59.202316 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.202400 kubelet[2918]: E0527 17:42:59.202395 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202400 kubelet[2918]: W0527 17:42:59.202399 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202538 kubelet[2918]: E0527 17:42:59.202406 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.202538 kubelet[2918]: E0527 17:42:59.202502 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202538 kubelet[2918]: W0527 17:42:59.202506 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202538 kubelet[2918]: E0527 17:42:59.202513 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.202764 kubelet[2918]: E0527 17:42:59.202707 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202764 kubelet[2918]: W0527 17:42:59.202715 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202764 kubelet[2918]: E0527 17:42:59.202727 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.202900 kubelet[2918]: E0527 17:42:59.202895 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.202937 kubelet[2918]: W0527 17:42:59.202931 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.202978 kubelet[2918]: E0527 17:42:59.202972 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.203217 kubelet[2918]: E0527 17:42:59.203059 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203217 kubelet[2918]: W0527 17:42:59.203065 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203217 kubelet[2918]: E0527 17:42:59.203071 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.203217 kubelet[2918]: E0527 17:42:59.203144 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203217 kubelet[2918]: W0527 17:42:59.203149 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203217 kubelet[2918]: E0527 17:42:59.203154 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.203337 kubelet[2918]: E0527 17:42:59.203255 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203337 kubelet[2918]: W0527 17:42:59.203260 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203337 kubelet[2918]: E0527 17:42:59.203267 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.203427 kubelet[2918]: E0527 17:42:59.203416 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203427 kubelet[2918]: W0527 17:42:59.203424 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203427 kubelet[2918]: E0527 17:42:59.203431 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.203594 kubelet[2918]: E0527 17:42:59.203584 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203594 kubelet[2918]: W0527 17:42:59.203593 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203649 kubelet[2918]: E0527 17:42:59.203602 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:42:59.203697 kubelet[2918]: E0527 17:42:59.203689 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:42:59.203716 kubelet[2918]: W0527 17:42:59.203697 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:42:59.203716 kubelet[2918]: E0527 17:42:59.203703 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:42:59.993011 kubelet[2918]: E0527 17:42:59.992441 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:43:00.047119 kubelet[2918]: I0527 17:43:00.047104 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:00.110270 containerd[1632]: time="2025-05-27T17:43:00.110234421Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:00.110899 containerd[1632]: time="2025-05-27T17:43:00.110873149Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 27 17:43:00.111377 containerd[1632]: time="2025-05-27T17:43:00.111359073Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:00.114901 containerd[1632]: time="2025-05-27T17:43:00.114841447Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:00.116369 containerd[1632]: time="2025-05-27T17:43:00.116346117Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size 
\"5934282\" in 1.253197504s" May 27 17:43:00.116369 containerd[1632]: time="2025-05-27T17:43:00.116370057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 27 17:43:00.118679 containerd[1632]: time="2025-05-27T17:43:00.118648899Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 17:43:00.123782 kubelet[2918]: E0527 17:43:00.123763 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124316 kubelet[2918]: W0527 17:43:00.124009 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124316 kubelet[2918]: E0527 17:43:00.124026 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.124316 kubelet[2918]: E0527 17:43:00.124119 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124316 kubelet[2918]: W0527 17:43:00.124124 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124316 kubelet[2918]: E0527 17:43:00.124129 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.124316 kubelet[2918]: E0527 17:43:00.124240 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124316 kubelet[2918]: W0527 17:43:00.124245 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124316 kubelet[2918]: E0527 17:43:00.124250 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.124599 kubelet[2918]: E0527 17:43:00.124533 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124599 kubelet[2918]: W0527 17:43:00.124541 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124599 kubelet[2918]: E0527 17:43:00.124549 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.124773 kubelet[2918]: E0527 17:43:00.124722 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124773 kubelet[2918]: W0527 17:43:00.124730 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124773 kubelet[2918]: E0527 17:43:00.124736 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.124882 kubelet[2918]: E0527 17:43:00.124876 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.124963 kubelet[2918]: W0527 17:43:00.124914 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.124963 kubelet[2918]: E0527 17:43:00.124924 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.125072 kubelet[2918]: E0527 17:43:00.125066 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.125152 kubelet[2918]: W0527 17:43:00.125101 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.125152 kubelet[2918]: E0527 17:43:00.125109 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.125265 kubelet[2918]: E0527 17:43:00.125258 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.125389 kubelet[2918]: W0527 17:43:00.125296 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.125389 kubelet[2918]: E0527 17:43:00.125304 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.125474 kubelet[2918]: E0527 17:43:00.125468 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.125508 kubelet[2918]: W0527 17:43:00.125503 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.125547 kubelet[2918]: E0527 17:43:00.125538 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.125700 kubelet[2918]: E0527 17:43:00.125652 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.125700 kubelet[2918]: W0527 17:43:00.125659 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.125700 kubelet[2918]: E0527 17:43:00.125664 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.125792 kubelet[2918]: E0527 17:43:00.125786 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.125825 kubelet[2918]: W0527 17:43:00.125820 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.125919 kubelet[2918]: E0527 17:43:00.125879 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.126008 kubelet[2918]: E0527 17:43:00.125985 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.126089 kubelet[2918]: W0527 17:43:00.126041 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.126089 kubelet[2918]: E0527 17:43:00.126049 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.126797 kubelet[2918]: E0527 17:43:00.126169 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.126797 kubelet[2918]: W0527 17:43:00.126707 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.126797 kubelet[2918]: E0527 17:43:00.126723 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.127224 kubelet[2918]: E0527 17:43:00.127207 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.127224 kubelet[2918]: W0527 17:43:00.127218 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.127315 kubelet[2918]: E0527 17:43:00.127226 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:43:00.127470 kubelet[2918]: E0527 17:43:00.127455 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:43:00.127470 kubelet[2918]: W0527 17:43:00.127465 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:43:00.127525 kubelet[2918]: E0527 17:43:00.127472 2918 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:43:00.133771 containerd[1632]: time="2025-05-27T17:43:00.132014054Z" level=info msg="Container cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:00.141923 containerd[1632]: time="2025-05-27T17:43:00.141885798Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\"" May 27 17:43:00.142972 containerd[1632]: time="2025-05-27T17:43:00.142806727Z" level=info msg="StartContainer for \"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\"" May 27 17:43:00.144325 containerd[1632]: time="2025-05-27T17:43:00.144302342Z" level=info msg="connecting to shim cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025" address="unix:///run/containerd/s/95cd56afd76389d2d2c003dac5ec2d8c773c95176151d4af35f62fcc216637cb" protocol=ttrpc version=3 May 27 17:43:00.169112 systemd[1]: Started cri-containerd-cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025.scope - libcontainer container cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025. 
May 27 17:43:00.196604 containerd[1632]: time="2025-05-27T17:43:00.196583492Z" level=info msg="StartContainer for \"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\" returns successfully" May 27 17:43:00.199940 systemd[1]: cri-containerd-cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025.scope: Deactivated successfully. May 27 17:43:00.240791 containerd[1632]: time="2025-05-27T17:43:00.240746507Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\" id:\"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\" pid:3592 exited_at:{seconds:1748367780 nanos:201875092}" May 27 17:43:00.251442 containerd[1632]: time="2025-05-27T17:43:00.251367915Z" level=info msg="received exit event container_id:\"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\" id:\"cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025\" pid:3592 exited_at:{seconds:1748367780 nanos:201875092}" May 27 17:43:00.268399 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cdce1bd5ae03238efa3ac0428f8e77b8e2f3c44ef4e3927d1888b7d0174da025-rootfs.mount: Deactivated successfully. 
May 27 17:43:01.050608 containerd[1632]: time="2025-05-27T17:43:01.050552332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 17:43:01.980821 kubelet[2918]: E0527 17:43:01.980788 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:43:03.980906 kubelet[2918]: E0527 17:43:03.980861 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:43:04.520750 containerd[1632]: time="2025-05-27T17:43:04.520647031Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:04.532734 containerd[1632]: time="2025-05-27T17:43:04.532699309Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 27 17:43:04.536141 containerd[1632]: time="2025-05-27T17:43:04.536098189Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:04.541858 containerd[1632]: time="2025-05-27T17:43:04.541810543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:04.542600 containerd[1632]: time="2025-05-27T17:43:04.542307089Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" 
with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 3.49170275s" May 27 17:43:04.542600 containerd[1632]: time="2025-05-27T17:43:04.542330541Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 27 17:43:04.544910 containerd[1632]: time="2025-05-27T17:43:04.544663405Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 17:43:04.574398 containerd[1632]: time="2025-05-27T17:43:04.574376745Z" level=info msg="Container 925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:04.643443 containerd[1632]: time="2025-05-27T17:43:04.643412265Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\"" May 27 17:43:04.677451 containerd[1632]: time="2025-05-27T17:43:04.677200697Z" level=info msg="StartContainer for \"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\"" May 27 17:43:04.682183 containerd[1632]: time="2025-05-27T17:43:04.679325016Z" level=info msg="connecting to shim 925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44" address="unix:///run/containerd/s/95cd56afd76389d2d2c003dac5ec2d8c773c95176151d4af35f62fcc216637cb" protocol=ttrpc version=3 May 27 17:43:04.710166 systemd[1]: Started cri-containerd-925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44.scope - libcontainer container 
925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44. May 27 17:43:04.744289 containerd[1632]: time="2025-05-27T17:43:04.744259978Z" level=info msg="StartContainer for \"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\" returns successfully" May 27 17:43:05.508104 kubelet[2918]: I0527 17:43:05.508021 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:05.981601 kubelet[2918]: E0527 17:43:05.981458 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:43:06.718226 systemd[1]: cri-containerd-925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44.scope: Deactivated successfully. May 27 17:43:06.718677 systemd[1]: cri-containerd-925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44.scope: Consumed 369ms CPU time, 163.2M memory peak, 2.1M read from disk, 170.9M written to disk. 
May 27 17:43:06.769569 containerd[1632]: time="2025-05-27T17:43:06.769466556Z" level=info msg="received exit event container_id:\"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\" id:\"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\" pid:3654 exited_at:{seconds:1748367786 nanos:760524169}" May 27 17:43:06.772277 containerd[1632]: time="2025-05-27T17:43:06.772250100Z" level=info msg="TaskExit event in podsandbox handler container_id:\"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\" id:\"925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44\" pid:3654 exited_at:{seconds:1748367786 nanos:760524169}" May 27 17:43:06.834018 kubelet[2918]: I0527 17:43:06.833977 2918 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 27 17:43:06.881016 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-925e2dd4a3c2ac901ba1e910021672deead0a4cf4bf19a36d7d4a1f50950cb44-rootfs.mount: Deactivated successfully. May 27 17:43:06.953809 systemd[1]: Created slice kubepods-burstable-podcf215349_0148_4cc0_8d29_84c0a329c3d7.slice - libcontainer container kubepods-burstable-podcf215349_0148_4cc0_8d29_84c0a329c3d7.slice. May 27 17:43:06.958284 systemd[1]: Created slice kubepods-burstable-pod25db2a47_8658_4006_a22d_dc476e7c2f7b.slice - libcontainer container kubepods-burstable-pod25db2a47_8658_4006_a22d_dc476e7c2f7b.slice. May 27 17:43:06.964244 systemd[1]: Created slice kubepods-besteffort-pod82fd1cbc_6be0_4a3a_8294_e11be1d04cd9.slice - libcontainer container kubepods-besteffort-pod82fd1cbc_6be0_4a3a_8294_e11be1d04cd9.slice. May 27 17:43:06.970572 systemd[1]: Created slice kubepods-besteffort-podf2e9e0e3_b52a_4cbc_87d0_04cf8f359b7d.slice - libcontainer container kubepods-besteffort-podf2e9e0e3_b52a_4cbc_87d0_04cf8f359b7d.slice. 
May 27 17:43:06.976408 systemd[1]: Created slice kubepods-besteffort-pod1db7bd7d_cd20_46eb_a3b9_8fab3073829e.slice - libcontainer container kubepods-besteffort-pod1db7bd7d_cd20_46eb_a3b9_8fab3073829e.slice. May 27 17:43:06.983635 systemd[1]: Created slice kubepods-besteffort-pod447bf712_e904_4fbf_928f_7f18d872264d.slice - libcontainer container kubepods-besteffort-pod447bf712_e904_4fbf_928f_7f18d872264d.slice. May 27 17:43:06.987894 systemd[1]: Created slice kubepods-besteffort-pod736fe7a0_4e4b_481d_b725_c57000011d7a.slice - libcontainer container kubepods-besteffort-pod736fe7a0_4e4b_481d_b725_c57000011d7a.slice. May 27 17:43:06.991208 systemd[1]: Created slice kubepods-besteffort-pod80af4ccd_d2ea_4a9d_ba83_54dcaaf7032a.slice - libcontainer container kubepods-besteffort-pod80af4ccd_d2ea_4a9d_ba83_54dcaaf7032a.slice. May 27 17:43:06.991944 kubelet[2918]: I0527 17:43:06.991931 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/447bf712-e904-4fbf-928f-7f18d872264d-tigera-ca-bundle\") pod \"calico-kube-controllers-8499647769-j5wlf\" (UID: \"447bf712-e904-4fbf-928f-7f18d872264d\") " pod="calico-system/calico-kube-controllers-8499647769-j5wlf" May 27 17:43:06.992048 kubelet[2918]: I0527 17:43:06.992039 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-calico-apiserver-certs\") pod \"calico-apiserver-657c668ffb-nhkv6\" (UID: \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\") " pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" May 27 17:43:06.992170 kubelet[2918]: I0527 17:43:06.992088 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d-goldmane-ca-bundle\") pod 
\"goldmane-8f77d7b6c-vtwqz\" (UID: \"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d\") " pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:06.992170 kubelet[2918]: I0527 17:43:06.992102 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf215349-0148-4cc0-8d29-84c0a329c3d7-config-volume\") pod \"coredns-7c65d6cfc9-bbx54\" (UID: \"cf215349-0148-4cc0-8d29-84c0a329c3d7\") " pod="kube-system/coredns-7c65d6cfc9-bbx54" May 27 17:43:06.992170 kubelet[2918]: I0527 17:43:06.992112 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd2tv\" (UniqueName: \"kubernetes.io/projected/cf215349-0148-4cc0-8d29-84c0a329c3d7-kube-api-access-xd2tv\") pod \"coredns-7c65d6cfc9-bbx54\" (UID: \"cf215349-0148-4cc0-8d29-84c0a329c3d7\") " pod="kube-system/coredns-7c65d6cfc9-bbx54" May 27 17:43:06.994162 kubelet[2918]: I0527 17:43:06.993018 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d-goldmane-key-pair\") pod \"goldmane-8f77d7b6c-vtwqz\" (UID: \"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d\") " pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:06.994162 kubelet[2918]: I0527 17:43:06.993068 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtz2z\" (UniqueName: \"kubernetes.io/projected/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-kube-api-access-qtz2z\") pod \"calico-apiserver-657c668ffb-m5qhs\" (UID: \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\") " pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" May 27 17:43:06.994162 kubelet[2918]: I0527 17:43:06.993106 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdh68\" (UniqueName: 
\"kubernetes.io/projected/736fe7a0-4e4b-481d-b725-c57000011d7a-kube-api-access-mdh68\") pod \"calico-apiserver-7965f8d8cd-mnv87\" (UID: \"736fe7a0-4e4b-481d-b725-c57000011d7a\") " pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" May 27 17:43:06.994162 kubelet[2918]: I0527 17:43:06.993259 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25db2a47-8658-4006-a22d-dc476e7c2f7b-config-volume\") pod \"coredns-7c65d6cfc9-2tczc\" (UID: \"25db2a47-8658-4006-a22d-dc476e7c2f7b\") " pod="kube-system/coredns-7c65d6cfc9-2tczc" May 27 17:43:06.994162 kubelet[2918]: I0527 17:43:06.993284 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94m6w\" (UniqueName: \"kubernetes.io/projected/25db2a47-8658-4006-a22d-dc476e7c2f7b-kube-api-access-94m6w\") pod \"coredns-7c65d6cfc9-2tczc\" (UID: \"25db2a47-8658-4006-a22d-dc476e7c2f7b\") " pod="kube-system/coredns-7c65d6cfc9-2tczc" May 27 17:43:07.000670 kubelet[2918]: I0527 17:43:06.993301 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-ca-bundle\") pod \"whisker-77bdbdbd8-4hdr7\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " pod="calico-system/whisker-77bdbdbd8-4hdr7" May 27 17:43:07.000670 kubelet[2918]: I0527 17:43:06.993406 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8fkg\" (UniqueName: \"kubernetes.io/projected/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-kube-api-access-z8fkg\") pod \"calico-apiserver-657c668ffb-nhkv6\" (UID: \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\") " pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" May 27 17:43:07.000670 kubelet[2918]: I0527 17:43:06.993561 2918 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d-config\") pod \"goldmane-8f77d7b6c-vtwqz\" (UID: \"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d\") " pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:07.000670 kubelet[2918]: I0527 17:43:06.993587 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbq6t\" (UniqueName: \"kubernetes.io/projected/447bf712-e904-4fbf-928f-7f18d872264d-kube-api-access-wbq6t\") pod \"calico-kube-controllers-8499647769-j5wlf\" (UID: \"447bf712-e904-4fbf-928f-7f18d872264d\") " pod="calico-system/calico-kube-controllers-8499647769-j5wlf" May 27 17:43:07.000670 kubelet[2918]: I0527 17:43:06.993609 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-backend-key-pair\") pod \"whisker-77bdbdbd8-4hdr7\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " pod="calico-system/whisker-77bdbdbd8-4hdr7" May 27 17:43:07.000766 kubelet[2918]: I0527 17:43:06.993746 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/736fe7a0-4e4b-481d-b725-c57000011d7a-calico-apiserver-certs\") pod \"calico-apiserver-7965f8d8cd-mnv87\" (UID: \"736fe7a0-4e4b-481d-b725-c57000011d7a\") " pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" May 27 17:43:07.000766 kubelet[2918]: I0527 17:43:06.993869 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-calico-apiserver-certs\") pod \"calico-apiserver-657c668ffb-m5qhs\" (UID: \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\") " 
pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" May 27 17:43:07.000766 kubelet[2918]: I0527 17:43:06.993890 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvbr\" (UniqueName: \"kubernetes.io/projected/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-kube-api-access-zkvbr\") pod \"whisker-77bdbdbd8-4hdr7\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " pod="calico-system/whisker-77bdbdbd8-4hdr7" May 27 17:43:07.000766 kubelet[2918]: I0527 17:43:06.993903 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlq8p\" (UniqueName: \"kubernetes.io/projected/f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d-kube-api-access-mlq8p\") pod \"goldmane-8f77d7b6c-vtwqz\" (UID: \"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d\") " pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:07.098668 containerd[1632]: time="2025-05-27T17:43:07.098308458Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 17:43:07.261877 containerd[1632]: time="2025-05-27T17:43:07.260823875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2tczc,Uid:25db2a47-8658-4006-a22d-dc476e7c2f7b,Namespace:kube-system,Attempt:0,}" May 27 17:43:07.264159 containerd[1632]: time="2025-05-27T17:43:07.263943651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bbx54,Uid:cf215349-0148-4cc0-8d29-84c0a329c3d7,Namespace:kube-system,Attempt:0,}" May 27 17:43:07.269749 containerd[1632]: time="2025-05-27T17:43:07.269152056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-nhkv6,Uid:82fd1cbc-6be0-4a3a-8294-e11be1d04cd9,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:07.276743 containerd[1632]: time="2025-05-27T17:43:07.276634923Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-8f77d7b6c-vtwqz,Uid:f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d,Namespace:calico-system,Attempt:0,}" May 27 17:43:07.284019 containerd[1632]: time="2025-05-27T17:43:07.283950843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-m5qhs,Uid:1db7bd7d-cd20-46eb-a3b9-8fab3073829e,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:07.288285 containerd[1632]: time="2025-05-27T17:43:07.288257645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499647769-j5wlf,Uid:447bf712-e904-4fbf-928f-7f18d872264d,Namespace:calico-system,Attempt:0,}" May 27 17:43:07.291247 containerd[1632]: time="2025-05-27T17:43:07.291069081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-mnv87,Uid:736fe7a0-4e4b-481d-b725-c57000011d7a,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:07.298860 containerd[1632]: time="2025-05-27T17:43:07.298827267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77bdbdbd8-4hdr7,Uid:80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a,Namespace:calico-system,Attempt:0,}" May 27 17:43:07.720515 containerd[1632]: time="2025-05-27T17:43:07.720418077Z" level=error msg="Failed to destroy network for sandbox \"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.721072 containerd[1632]: time="2025-05-27T17:43:07.720968016Z" level=error msg="Failed to destroy network for sandbox \"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.721434 containerd[1632]: time="2025-05-27T17:43:07.721412781Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-mnv87,Uid:736fe7a0-4e4b-481d-b725-c57000011d7a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.722146 containerd[1632]: time="2025-05-27T17:43:07.722127420Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499647769-j5wlf,Uid:447bf712-e904-4fbf-928f-7f18d872264d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.724680 containerd[1632]: time="2025-05-27T17:43:07.724524058Z" level=error msg="Failed to destroy network for sandbox \"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.726693 kubelet[2918]: E0527 17:43:07.726559 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.726693 kubelet[2918]: E0527 17:43:07.726672 
2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.728617 containerd[1632]: time="2025-05-27T17:43:07.728577709Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-nhkv6,Uid:82fd1cbc-6be0-4a3a-8294-e11be1d04cd9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.729957 containerd[1632]: time="2025-05-27T17:43:07.729456135Z" level=error msg="Failed to destroy network for sandbox \"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.731696 kubelet[2918]: E0527 17:43:07.731494 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8499647769-j5wlf" May 27 17:43:07.731696 kubelet[2918]: E0527 17:43:07.731526 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8499647769-j5wlf" May 27 17:43:07.731696 kubelet[2918]: E0527 17:43:07.731619 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" May 27 17:43:07.731696 kubelet[2918]: E0527 17:43:07.731632 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" May 27 17:43:07.731826 containerd[1632]: time="2025-05-27T17:43:07.729491474Z" level=error msg="Failed to destroy network for sandbox \"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.731902 containerd[1632]: time="2025-05-27T17:43:07.731888640Z" level=error msg="Failed to destroy network for sandbox \"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.732027 containerd[1632]: time="2025-05-27T17:43:07.731407645Z" level=error msg="Failed to destroy network for sandbox \"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.732097 containerd[1632]: time="2025-05-27T17:43:07.731463560Z" level=error msg="Failed to destroy network for sandbox \"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.732776 kubelet[2918]: E0527 17:43:07.732748 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7965f8d8cd-mnv87_calico-apiserver(736fe7a0-4e4b-481d-b725-c57000011d7a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7965f8d8cd-mnv87_calico-apiserver(736fe7a0-4e4b-481d-b725-c57000011d7a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"81fb1c86ddef77bcb6e0bbdfc1ad9519d89b3919db27409db5f5ec1fcf952b50\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" podUID="736fe7a0-4e4b-481d-b725-c57000011d7a" May 27 17:43:07.733139 containerd[1632]: time="2025-05-27T17:43:07.732904756Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-2tczc,Uid:25db2a47-8658-4006-a22d-dc476e7c2f7b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.733239 kubelet[2918]: E0527 17:43:07.733220 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8499647769-j5wlf_calico-system(447bf712-e904-4fbf-928f-7f18d872264d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8499647769-j5wlf_calico-system(447bf712-e904-4fbf-928f-7f18d872264d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f150fa9e781f66cea163f9f799e0adeaa3330940538004e42a9467a95057cda\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8499647769-j5wlf" podUID="447bf712-e904-4fbf-928f-7f18d872264d" May 27 17:43:07.733365 kubelet[2918]: E0527 17:43:07.733354 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.733418 kubelet[2918]: E0527 17:43:07.733409 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2tczc" May 27 17:43:07.733459 kubelet[2918]: E0527 17:43:07.733452 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-2tczc" May 27 17:43:07.733511 kubelet[2918]: E0527 17:43:07.733499 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-2tczc_kube-system(25db2a47-8658-4006-a22d-dc476e7c2f7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-2tczc_kube-system(25db2a47-8658-4006-a22d-dc476e7c2f7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6565728904749f90277244636c88dd1b97b45280b6681ff942f1947bbd49be21\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-2tczc" podUID="25db2a47-8658-4006-a22d-dc476e7c2f7b" May 27 17:43:07.733626 kubelet[2918]: E0527 17:43:07.733567 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 27 17:43:07.733626 kubelet[2918]: E0527 17:43:07.733583 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" May 27 17:43:07.733626 kubelet[2918]: E0527 17:43:07.733591 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" May 27 17:43:07.733692 kubelet[2918]: E0527 17:43:07.733604 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-657c668ffb-nhkv6_calico-apiserver(82fd1cbc-6be0-4a3a-8294-e11be1d04cd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-657c668ffb-nhkv6_calico-apiserver(82fd1cbc-6be0-4a3a-8294-e11be1d04cd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f15ea59bdf5b24940ebb24009f0f2c54c973871f6693fd656b924e4447735d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" podUID="82fd1cbc-6be0-4a3a-8294-e11be1d04cd9" May 27 17:43:07.733939 kubelet[2918]: E0527 17:43:07.733795 2918 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.733939 kubelet[2918]: E0527 17:43:07.733809 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:07.733939 kubelet[2918]: E0527 17:43:07.733818 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-vtwqz" May 27 17:43:07.734086 containerd[1632]: time="2025-05-27T17:43:07.733690247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-vtwqz,Uid:f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.734126 kubelet[2918]: E0527 17:43:07.733835 2918 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-8f77d7b6c-vtwqz_calico-system(f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-8f77d7b6c-vtwqz_calico-system(f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1241e03dc9380ab0b321de4faeb98410c4165f8f2b2f7d44b2b66be3b35b6c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:43:07.734621 containerd[1632]: time="2025-05-27T17:43:07.734552102Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bbx54,Uid:cf215349-0148-4cc0-8d29-84c0a329c3d7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.734732 kubelet[2918]: E0527 17:43:07.734715 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.734787 kubelet[2918]: E0527 17:43:07.734777 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-bbx54" May 27 17:43:07.734872 kubelet[2918]: E0527 17:43:07.734827 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-bbx54" May 27 17:43:07.734872 kubelet[2918]: E0527 17:43:07.734850 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-bbx54_kube-system(cf215349-0148-4cc0-8d29-84c0a329c3d7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-bbx54_kube-system(cf215349-0148-4cc0-8d29-84c0a329c3d7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0ef05acb2e5a88dcaafde4496f05aa5440e2728daa05baecd8bd29fe88e5329\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-bbx54" podUID="cf215349-0148-4cc0-8d29-84c0a329c3d7" May 27 17:43:07.734935 containerd[1632]: time="2025-05-27T17:43:07.734834597Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77bdbdbd8-4hdr7,Uid:80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.735054 kubelet[2918]: E0527 17:43:07.735044 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.735142 kubelet[2918]: E0527 17:43:07.735098 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-77bdbdbd8-4hdr7" May 27 17:43:07.735142 kubelet[2918]: E0527 17:43:07.735109 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-77bdbdbd8-4hdr7" May 27 17:43:07.735142 kubelet[2918]: E0527 17:43:07.735126 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-77bdbdbd8-4hdr7_calico-system(80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-77bdbdbd8-4hdr7_calico-system(80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"0b50a163a45fb6c36f05d5b0bcd9eb3c8025f70818c2b2662e1cdd0bbdf23fff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-77bdbdbd8-4hdr7" podUID="80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a" May 27 17:43:07.735500 containerd[1632]: time="2025-05-27T17:43:07.735480371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-m5qhs,Uid:1db7bd7d-cd20-46eb-a3b9-8fab3073829e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.735589 kubelet[2918]: E0527 17:43:07.735578 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:07.735647 kubelet[2918]: E0527 17:43:07.735630 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" May 27 17:43:07.735688 kubelet[2918]: E0527 17:43:07.735679 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for 
pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" May 27 17:43:07.735746 kubelet[2918]: E0527 17:43:07.735735 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-657c668ffb-m5qhs_calico-apiserver(1db7bd7d-cd20-46eb-a3b9-8fab3073829e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-657c668ffb-m5qhs_calico-apiserver(1db7bd7d-cd20-46eb-a3b9-8fab3073829e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5105e4c67fa1dbbce26293ec268cfea4a27cf1332a86dc8b753fd1dd4c7a0a62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" podUID="1db7bd7d-cd20-46eb-a3b9-8fab3073829e" May 27 17:43:07.881007 systemd[1]: run-netns-cni\x2deac46d69\x2dc527\x2d665c\x2d61fa\x2d805417d1894c.mount: Deactivated successfully. May 27 17:43:07.881082 systemd[1]: run-netns-cni\x2d9eb1fc16\x2dc18c\x2db5ad\x2d80e5\x2d96767d137128.mount: Deactivated successfully. May 27 17:43:08.001428 systemd[1]: Created slice kubepods-besteffort-pod2e20f778_9dbd_4e58_98d4_243c667b05e3.slice - libcontainer container kubepods-besteffort-pod2e20f778_9dbd_4e58_98d4_243c667b05e3.slice. 
May 27 17:43:08.008490 containerd[1632]: time="2025-05-27T17:43:08.008460245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dhvg9,Uid:2e20f778-9dbd-4e58-98d4-243c667b05e3,Namespace:calico-system,Attempt:0,}" May 27 17:43:08.047624 containerd[1632]: time="2025-05-27T17:43:08.047592783Z" level=error msg="Failed to destroy network for sandbox \"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:08.048961 systemd[1]: run-netns-cni\x2d45e9bc16\x2d2d62\x2d791c\x2d400b\x2dab153cecc185.mount: Deactivated successfully. May 27 17:43:08.050089 containerd[1632]: time="2025-05-27T17:43:08.049988889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dhvg9,Uid:2e20f778-9dbd-4e58-98d4-243c667b05e3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:08.050201 kubelet[2918]: E0527 17:43:08.050173 2918 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:43:08.050403 kubelet[2918]: E0527 17:43:08.050214 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dhvg9" May 27 17:43:08.050403 kubelet[2918]: E0527 17:43:08.050233 2918 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dhvg9" May 27 17:43:08.050403 kubelet[2918]: E0527 17:43:08.050258 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dhvg9_calico-system(2e20f778-9dbd-4e58-98d4-243c667b05e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dhvg9_calico-system(2e20f778-9dbd-4e58-98d4-243c667b05e3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed27fd7b87c710bea3b2f16f690082e8ce7919806ea0500687d205d74173d470\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dhvg9" podUID="2e20f778-9dbd-4e58-98d4-243c667b05e3" May 27 17:43:11.436269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3032148148.mount: Deactivated successfully. 
May 27 17:43:11.511669 containerd[1632]: time="2025-05-27T17:43:11.507383941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:11.517696 containerd[1632]: time="2025-05-27T17:43:11.517578847Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 17:43:11.520509 containerd[1632]: time="2025-05-27T17:43:11.519640185Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:11.525344 containerd[1632]: time="2025-05-27T17:43:11.521906223Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:11.525495 containerd[1632]: time="2025-05-27T17:43:11.524501427Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 4.424868948s" May 27 17:43:11.525881 containerd[1632]: time="2025-05-27T17:43:11.525870481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 17:43:11.550756 containerd[1632]: time="2025-05-27T17:43:11.550685341Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 17:43:11.567297 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1143468126.mount: 
Deactivated successfully. May 27 17:43:11.567410 containerd[1632]: time="2025-05-27T17:43:11.567372572Z" level=info msg="Container 9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:11.607744 containerd[1632]: time="2025-05-27T17:43:11.607713176Z" level=info msg="CreateContainer within sandbox \"bdd450b7530cdf6945b3efcb35beaf4a21c9b7622f78bf8721420ea22b229c47\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\"" May 27 17:43:11.608128 containerd[1632]: time="2025-05-27T17:43:11.608103209Z" level=info msg="StartContainer for \"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\"" May 27 17:43:11.610594 containerd[1632]: time="2025-05-27T17:43:11.610569779Z" level=info msg="connecting to shim 9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24" address="unix:///run/containerd/s/95cd56afd76389d2d2c003dac5ec2d8c773c95176151d4af35f62fcc216637cb" protocol=ttrpc version=3 May 27 17:43:11.764110 systemd[1]: Started cri-containerd-9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24.scope - libcontainer container 9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24. 
May 27 17:43:11.974084 containerd[1632]: time="2025-05-27T17:43:11.973989148Z" level=info msg="StartContainer for \"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" returns successfully" May 27 17:43:12.292548 kubelet[2918]: I0527 17:43:12.292483 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7g2n8" podStartSLOduration=2.30894795 podStartE2EDuration="17.292470506s" podCreationTimestamp="2025-05-27 17:42:55 +0000 UTC" firstStartedPulling="2025-05-27 17:42:56.543808864 +0000 UTC m=+17.705300001" lastFinishedPulling="2025-05-27 17:43:11.52733142 +0000 UTC m=+32.688822557" observedRunningTime="2025-05-27 17:43:12.291831205 +0000 UTC m=+33.453322351" watchObservedRunningTime="2025-05-27 17:43:12.292470506 +0000 UTC m=+33.453961652" May 27 17:43:12.756045 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 17:43:12.757586 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 27 17:43:12.773021 containerd[1632]: time="2025-05-27T17:43:12.772809588Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" id:\"0cc9ce05964769948e3eba997225f799558d2a94423746ce3fb3e19ab1905344\" pid:3991 exit_status:1 exited_at:{seconds:1748367792 nanos:763449285}" May 27 17:43:13.180301 containerd[1632]: time="2025-05-27T17:43:13.180214135Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" id:\"9ae2ff8c3fed4a838ab1cc1811db29b0785f47888398f6ec1a634e2c4ea2e242\" pid:4043 exit_status:1 exited_at:{seconds:1748367793 nanos:179948868}" May 27 17:43:13.334680 kubelet[2918]: I0527 17:43:13.334655 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-ca-bundle\") pod \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " May 27 17:43:13.335326 kubelet[2918]: I0527 17:43:13.335114 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-backend-key-pair\") pod \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " May 27 17:43:13.335326 kubelet[2918]: I0527 17:43:13.335141 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvbr\" (UniqueName: \"kubernetes.io/projected/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-kube-api-access-zkvbr\") pod \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\" (UID: \"80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a\") " May 27 17:43:13.335326 kubelet[2918]: I0527 17:43:13.334865 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a" (UID: "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 27 17:43:13.335326 kubelet[2918]: I0527 17:43:13.335218 2918 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 27 17:43:13.338840 kubelet[2918]: I0527 17:43:13.338794 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a" (UID: "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" May 27 17:43:13.339556 kubelet[2918]: I0527 17:43:13.339529 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-kube-api-access-zkvbr" (OuterVolumeSpecName: "kube-api-access-zkvbr") pod "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a" (UID: "80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a"). InnerVolumeSpecName "kube-api-access-zkvbr". PluginName "kubernetes.io/projected", VolumeGidValue "" May 27 17:43:13.339878 systemd[1]: var-lib-kubelet-pods-80af4ccd\x2dd2ea\x2d4a9d\x2dba83\x2d54dcaaf7032a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 17:43:13.342353 systemd[1]: var-lib-kubelet-pods-80af4ccd\x2dd2ea\x2d4a9d\x2dba83\x2d54dcaaf7032a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzkvbr.mount: Deactivated successfully. 
May 27 17:43:13.436044 kubelet[2918]: I0527 17:43:13.435935 2918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvbr\" (UniqueName: \"kubernetes.io/projected/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-kube-api-access-zkvbr\") on node \"localhost\" DevicePath \"\"" May 27 17:43:13.436044 kubelet[2918]: I0527 17:43:13.435958 2918 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" May 27 17:43:14.111243 systemd[1]: Removed slice kubepods-besteffort-pod80af4ccd_d2ea_4a9d_ba83_54dcaaf7032a.slice - libcontainer container kubepods-besteffort-pod80af4ccd_d2ea_4a9d_ba83_54dcaaf7032a.slice. May 27 17:43:14.699960 systemd[1]: Created slice kubepods-besteffort-pod2b64553a_fefe_49c7_909d_c50ce828eef7.slice - libcontainer container kubepods-besteffort-pod2b64553a_fefe_49c7_909d_c50ce828eef7.slice. May 27 17:43:14.873628 kubelet[2918]: I0527 17:43:14.873590 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2b64553a-fefe-49c7-909d-c50ce828eef7-whisker-backend-key-pair\") pod \"whisker-84f848d4b6-xtg49\" (UID: \"2b64553a-fefe-49c7-909d-c50ce828eef7\") " pod="calico-system/whisker-84f848d4b6-xtg49" May 27 17:43:14.873628 kubelet[2918]: I0527 17:43:14.873630 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b64553a-fefe-49c7-909d-c50ce828eef7-whisker-ca-bundle\") pod \"whisker-84f848d4b6-xtg49\" (UID: \"2b64553a-fefe-49c7-909d-c50ce828eef7\") " pod="calico-system/whisker-84f848d4b6-xtg49" May 27 17:43:14.873925 kubelet[2918]: I0527 17:43:14.873646 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4vm\" 
(UniqueName: \"kubernetes.io/projected/2b64553a-fefe-49c7-909d-c50ce828eef7-kube-api-access-2m4vm\") pod \"whisker-84f848d4b6-xtg49\" (UID: \"2b64553a-fefe-49c7-909d-c50ce828eef7\") " pod="calico-system/whisker-84f848d4b6-xtg49" May 27 17:43:14.943829 systemd-networkd[1518]: vxlan.calico: Link UP May 27 17:43:14.943834 systemd-networkd[1518]: vxlan.calico: Gained carrier May 27 17:43:15.022828 containerd[1632]: time="2025-05-27T17:43:15.022727712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f848d4b6-xtg49,Uid:2b64553a-fefe-49c7-909d-c50ce828eef7,Namespace:calico-system,Attempt:0,}" May 27 17:43:15.026451 kubelet[2918]: I0527 17:43:15.026338 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a" path="/var/lib/kubelet/pods/80af4ccd-d2ea-4a9d-ba83-54dcaaf7032a/volumes" May 27 17:43:15.744377 systemd-networkd[1518]: cali18bae825f74: Link UP May 27 17:43:15.744800 systemd-networkd[1518]: cali18bae825f74: Gained carrier May 27 17:43:15.760036 containerd[1632]: 2025-05-27 17:43:15.150 [INFO][4230] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--84f848d4b6--xtg49-eth0 whisker-84f848d4b6- calico-system 2b64553a-fefe-49c7-909d-c50ce828eef7 893 0 2025-05-27 17:43:14 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:84f848d4b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-84f848d4b6-xtg49 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali18bae825f74 [] [] }} ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-" May 27 17:43:15.760036 containerd[1632]: 2025-05-27 17:43:15.151 [INFO][4230] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.760036 containerd[1632]: 2025-05-27 17:43:15.668 [INFO][4242] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" HandleID="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Workload="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.672 [INFO][4242] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" HandleID="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Workload="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fc10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-84f848d4b6-xtg49", "timestamp":"2025-05-27 17:43:15.668743578 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.672 [INFO][4242] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.672 [INFO][4242] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.672 [INFO][4242] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.685 [INFO][4242] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" host="localhost" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.696 [INFO][4242] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.699 [INFO][4242] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.701 [INFO][4242] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.702 [INFO][4242] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:15.760187 containerd[1632]: 2025-05-27 17:43:15.702 [INFO][4242] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" host="localhost" May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.703 [INFO][4242] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.718 [INFO][4242] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" host="localhost" May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.722 [INFO][4242] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" host="localhost" May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.722 [INFO][4242] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" host="localhost" May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.722 [INFO][4242] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:15.760398 containerd[1632]: 2025-05-27 17:43:15.722 [INFO][4242] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" HandleID="k8s-pod-network.3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Workload="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.760528 containerd[1632]: 2025-05-27 17:43:15.724 [INFO][4230] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--84f848d4b6--xtg49-eth0", GenerateName:"whisker-84f848d4b6-", Namespace:"calico-system", SelfLink:"", UID:"2b64553a-fefe-49c7-909d-c50ce828eef7", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 43, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84f848d4b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-84f848d4b6-xtg49", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali18bae825f74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:15.760528 containerd[1632]: 2025-05-27 17:43:15.724 [INFO][4230] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.762081 containerd[1632]: 2025-05-27 17:43:15.724 [INFO][4230] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali18bae825f74 ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.762081 containerd[1632]: 2025-05-27 17:43:15.746 [INFO][4230] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.762118 containerd[1632]: 2025-05-27 17:43:15.746 [INFO][4230] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" 
WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--84f848d4b6--xtg49-eth0", GenerateName:"whisker-84f848d4b6-", Namespace:"calico-system", SelfLink:"", UID:"2b64553a-fefe-49c7-909d-c50ce828eef7", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 43, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"84f848d4b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de", Pod:"whisker-84f848d4b6-xtg49", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali18bae825f74", MAC:"0e:9c:c1:4d:40:15", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:15.763092 containerd[1632]: 2025-05-27 17:43:15.758 [INFO][4230] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" Namespace="calico-system" Pod="whisker-84f848d4b6-xtg49" WorkloadEndpoint="localhost-k8s-whisker--84f848d4b6--xtg49-eth0" May 27 17:43:15.862012 containerd[1632]: time="2025-05-27T17:43:15.861548601Z" level=info msg="connecting to shim 
3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de" address="unix:///run/containerd/s/2e3cbb5174541cd147935e9aaf96231007c5bf6b4b59b4e906d559533bebd7ce" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:15.885386 systemd[1]: Started cri-containerd-3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de.scope - libcontainer container 3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de. May 27 17:43:15.903112 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:15.941040 containerd[1632]: time="2025-05-27T17:43:15.940954003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f848d4b6-xtg49,Uid:2b64553a-fefe-49c7-909d-c50ce828eef7,Namespace:calico-system,Attempt:0,} returns sandbox id \"3d915d4c01824825483115cf4aa4b8d403939a77a4eb4ad154b0e8e9ccd622de\"" May 27 17:43:15.973580 containerd[1632]: time="2025-05-27T17:43:15.973522386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:43:16.255105 containerd[1632]: time="2025-05-27T17:43:16.255068200Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:16.255722 containerd[1632]: time="2025-05-27T17:43:16.255701608Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:16.255784 containerd[1632]: 
time="2025-05-27T17:43:16.255748876Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:43:16.260560 kubelet[2918]: E0527 17:43:16.255886 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:16.260902 kubelet[2918]: E0527 17:43:16.260575 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:16.266036 kubelet[2918]: E0527 17:43:16.266005 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:e89948687a8847d995f11dcd4865307d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:16.267582 containerd[1632]: 
time="2025-05-27T17:43:16.267560841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:43:16.500585 containerd[1632]: time="2025-05-27T17:43:16.500549234Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:16.501033 containerd[1632]: time="2025-05-27T17:43:16.501011992Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:16.501128 containerd[1632]: time="2025-05-27T17:43:16.501057850Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:43:16.501166 kubelet[2918]: E0527 17:43:16.501142 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:16.501211 kubelet[2918]: E0527 17:43:16.501174 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:16.501320 kubelet[2918]: E0527 17:43:16.501276 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:16.503131 kubelet[2918]: E0527 17:43:16.503064 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:43:16.536143 systemd-networkd[1518]: vxlan.calico: Gained IPv6LL 
May 27 17:43:17.116333 kubelet[2918]: E0527 17:43:17.115870 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:43:17.624129 systemd-networkd[1518]: cali18bae825f74: Gained IPv6LL May 27 17:43:17.982074 containerd[1632]: time="2025-05-27T17:43:17.981849923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499647769-j5wlf,Uid:447bf712-e904-4fbf-928f-7f18d872264d,Namespace:calico-system,Attempt:0,}" May 27 17:43:18.065858 systemd-networkd[1518]: cali7d18d884ef5: Link UP May 27 17:43:18.066171 systemd-networkd[1518]: cali7d18d884ef5: Gained carrier May 27 17:43:18.080165 containerd[1632]: 2025-05-27 17:43:18.012 [INFO][4346] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0 calico-kube-controllers-8499647769- calico-system 447bf712-e904-4fbf-928f-7f18d872264d 814 0 2025-05-27 17:42:56 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8499647769 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-8499647769-j5wlf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7d18d884ef5 [] [] }} ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-" May 27 17:43:18.080165 containerd[1632]: 2025-05-27 17:43:18.013 [INFO][4346] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.080165 containerd[1632]: 2025-05-27 17:43:18.040 [INFO][4358] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" HandleID="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Workload="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.041 [INFO][4358] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" HandleID="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Workload="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d92f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-8499647769-j5wlf", "timestamp":"2025-05-27 17:43:18.040935187 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.041 [INFO][4358] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.041 [INFO][4358] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.041 [INFO][4358] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.045 [INFO][4358] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" host="localhost" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.047 [INFO][4358] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.050 [INFO][4358] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.051 [INFO][4358] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.052 [INFO][4358] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:18.080313 containerd[1632]: 2025-05-27 17:43:18.052 [INFO][4358] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" host="localhost" May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.053 [INFO][4358] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12 May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.056 [INFO][4358] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" host="localhost" May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.060 [INFO][4358] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" host="localhost" May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.060 [INFO][4358] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" host="localhost" May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.060 [INFO][4358] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:18.080634 containerd[1632]: 2025-05-27 17:43:18.060 [INFO][4358] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" HandleID="k8s-pod-network.6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Workload="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.081034 containerd[1632]: 2025-05-27 17:43:18.064 [INFO][4346] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0", GenerateName:"calico-kube-controllers-8499647769-", Namespace:"calico-system", SelfLink:"", UID:"447bf712-e904-4fbf-928f-7f18d872264d", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499647769", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-8499647769-j5wlf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7d18d884ef5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:18.081087 containerd[1632]: 2025-05-27 17:43:18.064 [INFO][4346] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.081087 containerd[1632]: 2025-05-27 17:43:18.064 [INFO][4346] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7d18d884ef5 ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.081087 containerd[1632]: 2025-05-27 17:43:18.068 [INFO][4346] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.081144 containerd[1632]: 
2025-05-27 17:43:18.068 [INFO][4346] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0", GenerateName:"calico-kube-controllers-8499647769-", Namespace:"calico-system", SelfLink:"", UID:"447bf712-e904-4fbf-928f-7f18d872264d", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8499647769", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12", Pod:"calico-kube-controllers-8499647769-j5wlf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7d18d884ef5", MAC:"02:b7:a1:91:f5:40", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:18.081187 containerd[1632]: 
2025-05-27 17:43:18.077 [INFO][4346] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" Namespace="calico-system" Pod="calico-kube-controllers-8499647769-j5wlf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--8499647769--j5wlf-eth0" May 27 17:43:18.100449 containerd[1632]: time="2025-05-27T17:43:18.100418886Z" level=info msg="connecting to shim 6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12" address="unix:///run/containerd/s/46e85ad17d3761bc91d15a93b472e2391ac05ccc532c29b22da8f476eb18e9dd" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:18.123213 systemd[1]: Started cri-containerd-6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12.scope - libcontainer container 6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12. May 27 17:43:18.131541 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:18.161418 containerd[1632]: time="2025-05-27T17:43:18.161389332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8499647769-j5wlf,Uid:447bf712-e904-4fbf-928f-7f18d872264d,Namespace:calico-system,Attempt:0,} returns sandbox id \"6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12\"" May 27 17:43:18.162745 containerd[1632]: time="2025-05-27T17:43:18.162730028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 17:43:19.981968 containerd[1632]: time="2025-05-27T17:43:19.981932354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bbx54,Uid:cf215349-0148-4cc0-8d29-84c0a329c3d7,Namespace:kube-system,Attempt:0,}" May 27 17:43:19.983322 containerd[1632]: time="2025-05-27T17:43:19.983307120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-vtwqz,Uid:f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d,Namespace:calico-system,Attempt:0,}" 
May 27 17:43:20.056908 systemd-networkd[1518]: cali7d18d884ef5: Gained IPv6LL May 27 17:43:20.154321 systemd-networkd[1518]: calicd5a79b7f4e: Link UP May 27 17:43:20.156101 systemd-networkd[1518]: calicd5a79b7f4e: Gained carrier May 27 17:43:20.178630 containerd[1632]: 2025-05-27 17:43:20.034 [INFO][4422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0 goldmane-8f77d7b6c- calico-system f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d 816 0 2025-05-27 17:42:55 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:8f77d7b6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-8f77d7b6c-vtwqz eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calicd5a79b7f4e [] [] }} ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-" May 27 17:43:20.178630 containerd[1632]: 2025-05-27 17:43:20.036 [INFO][4422] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.178630 containerd[1632]: 2025-05-27 17:43:20.079 [INFO][4444] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" HandleID="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Workload="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.080 [INFO][4444] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" HandleID="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Workload="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d3930), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-8f77d7b6c-vtwqz", "timestamp":"2025-05-27 17:43:20.079944349 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.080 [INFO][4444] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.080 [INFO][4444] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.080 [INFO][4444] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.092 [INFO][4444] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" host="localhost" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.107 [INFO][4444] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.123 [INFO][4444] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.129 [INFO][4444] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.132 [INFO][4444] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" May 27 17:43:20.178911 containerd[1632]: 2025-05-27 17:43:20.132 [INFO][4444] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" host="localhost" May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.135 [INFO][4444] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80 May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.139 [INFO][4444] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" host="localhost" May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.143 [INFO][4444] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" host="localhost" May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.143 [INFO][4444] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" host="localhost" May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.143 [INFO][4444] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:43:20.180372 containerd[1632]: 2025-05-27 17:43:20.144 [INFO][4444] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" HandleID="k8s-pod-network.da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Workload="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.180597 containerd[1632]: 2025-05-27 17:43:20.148 [INFO][4422] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-8f77d7b6c-vtwqz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calicd5a79b7f4e", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:20.180597 containerd[1632]: 2025-05-27 17:43:20.148 [INFO][4422] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.181605 containerd[1632]: 2025-05-27 17:43:20.148 [INFO][4422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicd5a79b7f4e ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.181605 containerd[1632]: 2025-05-27 17:43:20.156 [INFO][4422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.181718 containerd[1632]: 2025-05-27 17:43:20.157 [INFO][4422] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 55, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80", Pod:"goldmane-8f77d7b6c-vtwqz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calicd5a79b7f4e", MAC:"92:b7:3e:b0:c6:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:20.181769 containerd[1632]: 2025-05-27 17:43:20.171 [INFO][4422] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" Namespace="calico-system" Pod="goldmane-8f77d7b6c-vtwqz" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--vtwqz-eth0" May 27 17:43:20.233773 containerd[1632]: time="2025-05-27T17:43:20.233697182Z" level=info msg="connecting to shim da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80" address="unix:///run/containerd/s/62b6fe1f12b9284d59e57ae7ffba85fec9dd5f8fbde5823c7b816e7a289c4513" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:20.272179 systemd[1]: Started cri-containerd-da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80.scope - libcontainer container da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80. 
May 27 17:43:20.277287 systemd-networkd[1518]: cali53d2da878fd: Link UP May 27 17:43:20.278017 systemd-networkd[1518]: cali53d2da878fd: Gained carrier May 27 17:43:20.296167 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:20.305582 containerd[1632]: 2025-05-27 17:43:20.034 [INFO][4421] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0 coredns-7c65d6cfc9- kube-system cf215349-0148-4cc0-8d29-84c0a329c3d7 810 0 2025-05-27 17:42:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-bbx54 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali53d2da878fd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-" May 27 17:43:20.305582 containerd[1632]: 2025-05-27 17:43:20.035 [INFO][4421] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.305582 containerd[1632]: 2025-05-27 17:43:20.103 [INFO][4446] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" HandleID="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Workload="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.103 [INFO][4446] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" HandleID="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Workload="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e830), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-bbx54", "timestamp":"2025-05-27 17:43:20.103459268 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.103 [INFO][4446] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.143 [INFO][4446] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.144 [INFO][4446] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.191 [INFO][4446] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" host="localhost" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.207 [INFO][4446] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.229 [INFO][4446] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.236 [INFO][4446] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.239 [INFO][4446] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" May 27 17:43:20.305729 containerd[1632]: 2025-05-27 17:43:20.239 [INFO][4446] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" host="localhost" May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.243 [INFO][4446] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351 May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.257 [INFO][4446] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" host="localhost" May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.268 [INFO][4446] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" host="localhost" May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.269 [INFO][4446] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" host="localhost" May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.269 [INFO][4446] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:43:20.315414 containerd[1632]: 2025-05-27 17:43:20.269 [INFO][4446] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" HandleID="k8s-pod-network.a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Workload="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.315536 containerd[1632]: 2025-05-27 17:43:20.273 [INFO][4421] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"cf215349-0148-4cc0-8d29-84c0a329c3d7", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-bbx54", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali53d2da878fd", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:20.316082 containerd[1632]: 2025-05-27 17:43:20.273 [INFO][4421] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.316082 containerd[1632]: 2025-05-27 17:43:20.273 [INFO][4421] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali53d2da878fd ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.316082 containerd[1632]: 2025-05-27 17:43:20.278 [INFO][4421] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.316144 containerd[1632]: 2025-05-27 17:43:20.278 [INFO][4421] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"cf215349-0148-4cc0-8d29-84c0a329c3d7", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351", Pod:"coredns-7c65d6cfc9-bbx54", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali53d2da878fd", MAC:"e2:78:cf:d7:93:c9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:20.316144 containerd[1632]: 2025-05-27 17:43:20.298 [INFO][4421] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bbx54" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--bbx54-eth0" May 27 17:43:20.360985 containerd[1632]: time="2025-05-27T17:43:20.360935638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-vtwqz,Uid:f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d,Namespace:calico-system,Attempt:0,} returns sandbox id \"da7dac5002e23f4e926cfb7405445530452aa05276d1154cd20f1f2f37a5da80\"" May 27 17:43:20.409396 containerd[1632]: time="2025-05-27T17:43:20.409361562Z" level=info msg="connecting to shim a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351" address="unix:///run/containerd/s/368710081e0cfc3d189a1fc939fda0ab661104fcce51c0d67e2b9c2d9ef64915" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:20.438181 systemd[1]: Started cri-containerd-a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351.scope - libcontainer container a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351. 
May 27 17:43:20.450912 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:20.979528 containerd[1632]: time="2025-05-27T17:43:20.979499550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bbx54,Uid:cf215349-0148-4cc0-8d29-84c0a329c3d7,Namespace:kube-system,Attempt:0,} returns sandbox id \"a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351\"" May 27 17:43:20.982006 containerd[1632]: time="2025-05-27T17:43:20.981950065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-m5qhs,Uid:1db7bd7d-cd20-46eb-a3b9-8fab3073829e,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:20.982494 containerd[1632]: time="2025-05-27T17:43:20.982382017Z" level=info msg="CreateContainer within sandbox \"a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:43:20.982937 containerd[1632]: time="2025-05-27T17:43:20.982860856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-mnv87,Uid:736fe7a0-4e4b-481d-b725-c57000011d7a,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:21.464099 systemd-networkd[1518]: calicd5a79b7f4e: Gained IPv6LL May 27 17:43:21.650033 containerd[1632]: time="2025-05-27T17:43:21.649770484Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:21.768343 containerd[1632]: time="2025-05-27T17:43:21.768310590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 27 17:43:21.811244 containerd[1632]: time="2025-05-27T17:43:21.811157077Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 
27 17:43:21.912239 systemd-networkd[1518]: cali53d2da878fd: Gained IPv6LL May 27 17:43:21.982497 containerd[1632]: time="2025-05-27T17:43:21.982475519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dhvg9,Uid:2e20f778-9dbd-4e58-98d4-243c667b05e3,Namespace:calico-system,Attempt:0,}" May 27 17:43:22.000125 containerd[1632]: time="2025-05-27T17:43:21.982852357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-nhkv6,Uid:82fd1cbc-6be0-4a3a-8294-e11be1d04cd9,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:22.000125 containerd[1632]: time="2025-05-27T17:43:21.997855005Z" level=info msg="Container ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:22.000125 containerd[1632]: time="2025-05-27T17:43:21.999873862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2tczc,Uid:25db2a47-8658-4006-a22d-dc476e7c2f7b,Namespace:kube-system,Attempt:0,}" May 27 17:43:21.985706 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1133456000.mount: Deactivated successfully. 
May 27 17:43:21.994631 systemd-networkd[1518]: cali100d3588ff5: Link UP May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.856 [INFO][4581] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0 calico-apiserver-7965f8d8cd- calico-apiserver 736fe7a0-4e4b-481d-b725-c57000011d7a 812 0 2025-05-27 17:42:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7965f8d8cd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7965f8d8cd-mnv87 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali100d3588ff5 [] [] }} ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.862 [INFO][4581] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.945 [INFO][4602] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" HandleID="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.945 [INFO][4602] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" HandleID="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c0fa0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7965f8d8cd-mnv87", "timestamp":"2025-05-27 17:43:21.945036206 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.945 [INFO][4602] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.945 [INFO][4602] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.945 [INFO][4602] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.949 [INFO][4602] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.952 [INFO][4602] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.954 [INFO][4602] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.955 [INFO][4602] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.957 [INFO][4602] ipam/ipam.go 235: Affinity is confirmed and block has been loaded 
cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.957 [INFO][4602] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.958 [INFO][4602] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.973 [INFO][4602] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4602] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4602] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" host="localhost" May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4602] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:43:22.023597 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4602] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" HandleID="k8s-pod-network.27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:21.994732 systemd-networkd[1518]: cali100d3588ff5: Gained carrier May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:21.989 [INFO][4581] cni-plugin/k8s.go 418: Populated endpoint ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0", GenerateName:"calico-apiserver-7965f8d8cd-", Namespace:"calico-apiserver", SelfLink:"", UID:"736fe7a0-4e4b-481d-b725-c57000011d7a", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7965f8d8cd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7965f8d8cd-mnv87", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali100d3588ff5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:21.990 [INFO][4581] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:21.990 [INFO][4581] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali100d3588ff5 ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:21.993 [INFO][4581] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:21.994 [INFO][4581] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0", GenerateName:"calico-apiserver-7965f8d8cd-", Namespace:"calico-apiserver", SelfLink:"", UID:"736fe7a0-4e4b-481d-b725-c57000011d7a", ResourceVersion:"812", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7965f8d8cd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed", Pod:"calico-apiserver-7965f8d8cd-mnv87", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali100d3588ff5", MAC:"5e:b3:db:29:d1:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.036789 containerd[1632]: 2025-05-27 17:43:22.013 [INFO][4581] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-mnv87" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--mnv87-eth0" May 27 17:43:21.997367 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3791646308.mount: Deactivated successfully. 
May 27 17:43:22.083578 systemd-networkd[1518]: cali2ab93dc9b9c: Link UP May 27 17:43:22.084400 systemd-networkd[1518]: cali2ab93dc9b9c: Gained carrier May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.856 [INFO][4575] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0 calico-apiserver-657c668ffb- calico-apiserver 1db7bd7d-cd20-46eb-a3b9-8fab3073829e 811 0 2025-05-27 17:42:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:657c668ffb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-657c668ffb-m5qhs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2ab93dc9b9c [] [] }} ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.863 [INFO][4575] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.958 [INFO][4604] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.959 [INFO][4604] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002a90c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-657c668ffb-m5qhs", "timestamp":"2025-05-27 17:43:21.958968077 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.959 [INFO][4604] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4604] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:21.983 [INFO][4604] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.050 [INFO][4604] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.053 [INFO][4604] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.055 [INFO][4604] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.057 [INFO][4604] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.059 [INFO][4604] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.059 [INFO][4604] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.060 [INFO][4604] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.065 [INFO][4604] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.079 [INFO][4604] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.080 [INFO][4604] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" host="localhost" May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.080 [INFO][4604] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:22.099904 containerd[1632]: 2025-05-27 17:43:22.080 [INFO][4604] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.081 [INFO][4575] cni-plugin/k8s.go 418: Populated endpoint ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0", GenerateName:"calico-apiserver-657c668ffb-", Namespace:"calico-apiserver", SelfLink:"", UID:"1db7bd7d-cd20-46eb-a3b9-8fab3073829e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"657c668ffb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-657c668ffb-m5qhs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2ab93dc9b9c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.081 [INFO][4575] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.081 [INFO][4575] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2ab93dc9b9c ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.083 [INFO][4575] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.084 [INFO][4575] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0", GenerateName:"calico-apiserver-657c668ffb-", Namespace:"calico-apiserver", SelfLink:"", UID:"1db7bd7d-cd20-46eb-a3b9-8fab3073829e", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"657c668ffb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca", Pod:"calico-apiserver-657c668ffb-m5qhs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2ab93dc9b9c", MAC:"fe:36:62:24:ec:6c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.105327 containerd[1632]: 2025-05-27 17:43:22.098 [INFO][4575] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-m5qhs" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:22.223501 containerd[1632]: time="2025-05-27T17:43:22.223459331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:22.245903 containerd[1632]: time="2025-05-27T17:43:22.245815979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 4.083068754s" May 27 17:43:22.245903 containerd[1632]: time="2025-05-27T17:43:22.245844274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 27 17:43:22.246694 containerd[1632]: time="2025-05-27T17:43:22.246614681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:43:22.268894 containerd[1632]: time="2025-05-27T17:43:22.268864409Z" level=info msg="CreateContainer within sandbox \"6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 17:43:22.332872 containerd[1632]: time="2025-05-27T17:43:22.332787969Z" level=info msg="CreateContainer within sandbox \"a4eff5e234cb9d637a6e120c7371d6583b9d830e4440e6abfd50ef7d9073b351\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e\"" May 27 17:43:22.335762 containerd[1632]: time="2025-05-27T17:43:22.335091448Z" level=info msg="StartContainer for \"ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e\"" May 27 17:43:22.336420 containerd[1632]: time="2025-05-27T17:43:22.336402951Z" level=info msg="connecting to shim ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e" address="unix:///run/containerd/s/368710081e0cfc3d189a1fc939fda0ab661104fcce51c0d67e2b9c2d9ef64915" protocol=ttrpc version=3 May 27 17:43:22.340420 containerd[1632]: time="2025-05-27T17:43:22.340388410Z" level=info msg="Container ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:22.350454 containerd[1632]: time="2025-05-27T17:43:22.350423313Z" level=info msg="CreateContainer within sandbox \"6c7fb3da7a4a99e8859aca1582084a513abc0dcdd5baaf07192cd56d9db68f12\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\"" May 27 17:43:22.351960 containerd[1632]: time="2025-05-27T17:43:22.351212982Z" level=info msg="StartContainer for \"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\"" May 27 17:43:22.358670 containerd[1632]: time="2025-05-27T17:43:22.358624500Z" level=info msg="connecting to shim ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d" address="unix:///run/containerd/s/46e85ad17d3761bc91d15a93b472e2391ac05ccc532c29b22da8f476eb18e9dd" protocol=ttrpc version=3 May 27 17:43:22.383012 containerd[1632]: time="2025-05-27T17:43:22.382112939Z" level=info msg="connecting to shim 94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" address="unix:///run/containerd/s/5d36c9bf69fb7d8fca9dc604312f4450ba3ecace46ebbfeab8e8332ccfe555b0" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:22.383716 systemd[1]: Started 
cri-containerd-ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e.scope - libcontainer container ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e. May 27 17:43:22.393934 containerd[1632]: time="2025-05-27T17:43:22.393788967Z" level=info msg="connecting to shim 27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed" address="unix:///run/containerd/s/277aec4fc1d29364d79dbe691b2a2848e2a21449f5583410b8a34522d1bb6c89" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:22.431326 systemd[1]: Started cri-containerd-ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d.scope - libcontainer container ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d. May 27 17:43:22.458296 systemd[1]: Started cri-containerd-27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed.scope - libcontainer container 27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed. May 27 17:43:22.478040 systemd[1]: Started cri-containerd-94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca.scope - libcontainer container 94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca. 
May 27 17:43:22.498782 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:22.535796 systemd-networkd[1518]: cali8b3bbb092d8: Link UP May 27 17:43:22.538578 systemd-networkd[1518]: cali8b3bbb092d8: Gained carrier May 27 17:43:22.562537 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:22.568361 containerd[1632]: time="2025-05-27T17:43:22.566882499Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:22.575701 containerd[1632]: time="2025-05-27T17:43:22.575125488Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:22.579043 containerd[1632]: time="2025-05-27T17:43:22.578522822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:43:22.580320 containerd[1632]: time="2025-05-27T17:43:22.580296256Z" level=info msg="StartContainer for \"ca02f8806a6860c8e22cec98007aa58f6cd9e451b27f56adacfeb8bb394d897e\" returns successfully" May 27 17:43:22.583064 containerd[1632]: time="2025-05-27T17:43:22.582954727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-mnv87,Uid:736fe7a0-4e4b-481d-b725-c57000011d7a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id 
\"27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed\"" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.369 [INFO][4636] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--dhvg9-eth0 csi-node-driver- calico-system 2e20f778-9dbd-4e58-98d4-243c667b05e3 688 0 2025-05-27 17:42:56 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:68bf44dd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-dhvg9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8b3bbb092d8 [] [] }} ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.370 [INFO][4636] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.439 [INFO][4702] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" HandleID="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Workload="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.439 [INFO][4702] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" 
HandleID="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Workload="localhost-k8s-csi--node--driver--dhvg9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e3050), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-dhvg9", "timestamp":"2025-05-27 17:43:22.4390559 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.440 [INFO][4702] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.440 [INFO][4702] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.441 [INFO][4702] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.457 [INFO][4702] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.471 [INFO][4702] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.481 [INFO][4702] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.484 [INFO][4702] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.491 [INFO][4702] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.491 
[INFO][4702] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.493 [INFO][4702] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6 May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.505 [INFO][4702] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4702] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4702] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" host="localhost" May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4702] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:43:22.586648 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4702] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" HandleID="k8s-pod-network.b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Workload="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.529 [INFO][4636] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dhvg9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2e20f778-9dbd-4e58-98d4-243c667b05e3", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-dhvg9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8b3bbb092d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.529 [INFO][4636] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.529 [INFO][4636] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b3bbb092d8 ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.544 [INFO][4636] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.546 [INFO][4636] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dhvg9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2e20f778-9dbd-4e58-98d4-243c667b05e3", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 56, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6", Pod:"csi-node-driver-dhvg9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8b3bbb092d8", MAC:"a6:56:d5:31:8e:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.588319 containerd[1632]: 2025-05-27 17:43:22.575 [INFO][4636] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" Namespace="calico-system" Pod="csi-node-driver-dhvg9" WorkloadEndpoint="localhost-k8s-csi--node--driver--dhvg9-eth0" May 27 17:43:22.609181 kubelet[2918]: E0527 17:43:22.609146 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:43:22.609458 kubelet[2918]: E0527 17:43:22.609189 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:43:22.612423 kubelet[2918]: E0527 17:43:22.612302 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlq8p,ReadOnly:t
rue,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-vtwqz_calico-system(f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:22.617005 containerd[1632]: time="2025-05-27T17:43:22.616517828Z" level=info msg="StartContainer for \"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" returns successfully" May 27 
17:43:22.620014 kubelet[2918]: E0527 17:43:22.619120 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:43:22.620589 containerd[1632]: time="2025-05-27T17:43:22.620535941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:43:22.644834 systemd-networkd[1518]: cali76799b25f8f: Link UP May 27 17:43:22.647026 systemd-networkd[1518]: cali76799b25f8f: Gained carrier May 27 17:43:22.678128 containerd[1632]: time="2025-05-27T17:43:22.677720601Z" level=info msg="connecting to shim b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6" address="unix:///run/containerd/s/a523145612eb5edfe48abf0db3e4731a06eb0fcc26113a1e450cafff0f8aaeda" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.402 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0 coredns-7c65d6cfc9- kube-system 25db2a47-8658-4006-a22d-dc476e7c2f7b 815 0 2025-05-27 17:42:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-2tczc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali76799b25f8f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.402 [INFO][4655] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.492 [INFO][4736] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" HandleID="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Workload="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.494 [INFO][4736] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" HandleID="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Workload="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00023b940), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-2tczc", "timestamp":"2025-05-27 17:43:22.492815961 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.494 [INFO][4736] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4736] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.524 [INFO][4736] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.556 [INFO][4736] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.573 [INFO][4736] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.586 [INFO][4736] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.595 [INFO][4736] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.604 [INFO][4736] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.604 [INFO][4736] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.610 [INFO][4736] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69 May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.616 [INFO][4736] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.627 [INFO][4736] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.627 [INFO][4736] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" host="localhost" May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.627 [INFO][4736] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:22.678708 containerd[1632]: 2025-05-27 17:43:22.629 [INFO][4736] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" HandleID="k8s-pod-network.e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Workload="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.638 [INFO][4655] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"25db2a47-8658-4006-a22d-dc476e7c2f7b", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-2tczc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali76799b25f8f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.638 [INFO][4655] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.638 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76799b25f8f ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.659 [INFO][4655] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" 
WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.661 [INFO][4655] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"25db2a47-8658-4006-a22d-dc476e7c2f7b", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69", Pod:"coredns-7c65d6cfc9-2tczc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali76799b25f8f", MAC:"f6:0b:9c:5a:82:8e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.680716 containerd[1632]: 2025-05-27 17:43:22.673 [INFO][4655] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" Namespace="kube-system" Pod="coredns-7c65d6cfc9-2tczc" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--2tczc-eth0" May 27 17:43:22.712636 systemd[1]: Started cri-containerd-b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6.scope - libcontainer container b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6. May 27 17:43:22.723097 containerd[1632]: time="2025-05-27T17:43:22.721780096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-m5qhs,Uid:1db7bd7d-cd20-46eb-a3b9-8fab3073829e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\"" May 27 17:43:22.735543 systemd-networkd[1518]: cali5eef2231ecc: Link UP May 27 17:43:22.739228 systemd-networkd[1518]: cali5eef2231ecc: Gained carrier May 27 17:43:22.761148 containerd[1632]: time="2025-05-27T17:43:22.761112081Z" level=info msg="connecting to shim e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69" address="unix:///run/containerd/s/2e4c4b17e4bc7dbe0ced5206bbbbe6ed7501bc379129311f81604edf7f6aa156" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.405 [INFO][4643] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0 calico-apiserver-657c668ffb- calico-apiserver 82fd1cbc-6be0-4a3a-8294-e11be1d04cd9 813 0 2025-05-27 17:42:53 +0000 
UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:657c668ffb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-657c668ffb-nhkv6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5eef2231ecc [] [] }} ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.406 [INFO][4643] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.544 [INFO][4742] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.544 [INFO][4742] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fa60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-657c668ffb-nhkv6", "timestamp":"2025-05-27 17:43:22.544618735 +0000 
UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.544 [INFO][4742] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.627 [INFO][4742] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.627 [INFO][4742] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.662 [INFO][4742] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.669 [INFO][4742] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.682 [INFO][4742] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.691 [INFO][4742] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.700 [INFO][4742] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.700 [INFO][4742] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.704 [INFO][4742] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.710 [INFO][4742] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.724 [INFO][4742] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.724 [INFO][4742] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" host="localhost" May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.724 [INFO][4742] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:43:22.774661 containerd[1632]: 2025-05-27 17:43:22.725 [INFO][4742] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.727 [INFO][4643] cni-plugin/k8s.go 418: Populated endpoint ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0", GenerateName:"calico-apiserver-657c668ffb-", Namespace:"calico-apiserver", SelfLink:"", UID:"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"657c668ffb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-657c668ffb-nhkv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5eef2231ecc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.728 [INFO][4643] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.728 [INFO][4643] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5eef2231ecc ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.743 [INFO][4643] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.744 [INFO][4643] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0", GenerateName:"calico-apiserver-657c668ffb-", 
Namespace:"calico-apiserver", SelfLink:"", UID:"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"657c668ffb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd", Pod:"calico-apiserver-657c668ffb-nhkv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5eef2231ecc", MAC:"5e:7e:41:ce:32:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:22.776739 containerd[1632]: 2025-05-27 17:43:22.767 [INFO][4643] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Namespace="calico-apiserver" Pod="calico-apiserver-657c668ffb-nhkv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:22.788045 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:22.797222 systemd[1]: Started cri-containerd-e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69.scope - libcontainer container 
e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69. May 27 17:43:22.811884 containerd[1632]: time="2025-05-27T17:43:22.811857147Z" level=info msg="connecting to shim 85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" address="unix:///run/containerd/s/d1233ef0c517a062dc9a775cf5f08f854adff9ce7c445d030b8227919c2dc5e3" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:22.842188 systemd[1]: Started cri-containerd-85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd.scope - libcontainer container 85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd. May 27 17:43:22.846755 containerd[1632]: time="2025-05-27T17:43:22.846730187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dhvg9,Uid:2e20f778-9dbd-4e58-98d4-243c667b05e3,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6\"" May 27 17:43:22.849631 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:22.862752 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:22.891667 containerd[1632]: time="2025-05-27T17:43:22.891453978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-2tczc,Uid:25db2a47-8658-4006-a22d-dc476e7c2f7b,Namespace:kube-system,Attempt:0,} returns sandbox id \"e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69\"" May 27 17:43:22.894841 containerd[1632]: time="2025-05-27T17:43:22.894812718Z" level=info msg="CreateContainer within sandbox \"e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:43:22.900628 containerd[1632]: time="2025-05-27T17:43:22.900562690Z" level=info msg="Container 89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7: CDI devices from CRI 
Config.CDIDevices: []" May 27 17:43:22.913442 containerd[1632]: time="2025-05-27T17:43:22.913412141Z" level=info msg="CreateContainer within sandbox \"e9a79f7d847ae41b685f37fbeaed5f2ecc5cd749c8ad678f5f44075f1450bb69\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7\"" May 27 17:43:22.914332 containerd[1632]: time="2025-05-27T17:43:22.914320195Z" level=info msg="StartContainer for \"89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7\"" May 27 17:43:22.918066 containerd[1632]: time="2025-05-27T17:43:22.918046536Z" level=info msg="connecting to shim 89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7" address="unix:///run/containerd/s/2e4c4b17e4bc7dbe0ced5206bbbbe6ed7501bc379129311f81604edf7f6aa156" protocol=ttrpc version=3 May 27 17:43:22.926249 containerd[1632]: time="2025-05-27T17:43:22.925825474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-657c668ffb-nhkv6,Uid:82fd1cbc-6be0-4a3a-8294-e11be1d04cd9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\"" May 27 17:43:22.941166 systemd[1]: Started cri-containerd-89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7.scope - libcontainer container 89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7. 
May 27 17:43:22.963349 containerd[1632]: time="2025-05-27T17:43:22.963316816Z" level=info msg="StartContainer for \"89f2de300b2af9c30e7d688c326142941f6d3a62811cabf19ad2d45044c13ca7\" returns successfully" May 27 17:43:23.190729 kubelet[2918]: E0527 17:43:23.189595 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:43:23.225394 kubelet[2918]: I0527 17:43:23.223294 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-2tczc" podStartSLOduration=39.217047227 podStartE2EDuration="39.217047227s" podCreationTimestamp="2025-05-27 17:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:43:23.20708746 +0000 UTC m=+44.368578596" watchObservedRunningTime="2025-05-27 17:43:23.217047227 +0000 UTC m=+44.378538372" May 27 17:43:23.232368 kubelet[2918]: I0527 17:43:23.232167 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-bbx54" podStartSLOduration=39.232154965 podStartE2EDuration="39.232154965s" podCreationTimestamp="2025-05-27 17:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:43:23.23158304 +0000 UTC m=+44.393074185" watchObservedRunningTime="2025-05-27 17:43:23.232154965 +0000 UTC m=+44.393646106" May 27 17:43:23.251738 kubelet[2918]: I0527 17:43:23.251093 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8499647769-j5wlf" podStartSLOduration=23.166947713 podStartE2EDuration="27.251082347s" podCreationTimestamp="2025-05-27 17:42:56 +0000 
UTC" firstStartedPulling="2025-05-27 17:43:18.162286302 +0000 UTC m=+39.323777439" lastFinishedPulling="2025-05-27 17:43:22.246420935 +0000 UTC m=+43.407912073" observedRunningTime="2025-05-27 17:43:23.243395966 +0000 UTC m=+44.404887112" watchObservedRunningTime="2025-05-27 17:43:23.251082347 +0000 UTC m=+44.412573493" May 27 17:43:23.267829 containerd[1632]: time="2025-05-27T17:43:23.265120148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" id:\"afeb9d47036576843ef28e6a59abe7f0fc6d63dedca117893ad0118d2a1e54ca\" pid:5073 exited_at:{seconds:1748367803 nanos:264946378}" May 27 17:43:23.576121 systemd-networkd[1518]: cali2ab93dc9b9c: Gained IPv6LL May 27 17:43:23.960115 systemd-networkd[1518]: cali100d3588ff5: Gained IPv6LL May 27 17:43:24.024397 systemd-networkd[1518]: cali8b3bbb092d8: Gained IPv6LL May 27 17:43:24.344112 systemd-networkd[1518]: cali76799b25f8f: Gained IPv6LL May 27 17:43:24.600146 systemd-networkd[1518]: cali5eef2231ecc: Gained IPv6LL May 27 17:43:26.947499 containerd[1632]: time="2025-05-27T17:43:26.947061475Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:26.952331 containerd[1632]: time="2025-05-27T17:43:26.952311057Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 27 17:43:26.959306 containerd[1632]: time="2025-05-27T17:43:26.959276065Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:26.972675 containerd[1632]: time="2025-05-27T17:43:26.972614910Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" May 27 17:43:26.973006 containerd[1632]: time="2025-05-27T17:43:26.972925909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 4.352367288s" May 27 17:43:26.973006 containerd[1632]: time="2025-05-27T17:43:26.972946026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:43:26.973932 containerd[1632]: time="2025-05-27T17:43:26.973909500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:43:27.022106 containerd[1632]: time="2025-05-27T17:43:27.022067365Z" level=info msg="CreateContainer within sandbox \"27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:43:27.136532 containerd[1632]: time="2025-05-27T17:43:27.136099491Z" level=info msg="Container a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:27.141551 containerd[1632]: time="2025-05-27T17:43:27.141522321Z" level=info msg="CreateContainer within sandbox \"27aab2bd0bb6e657ee60cc89c31aeda80a072ddd97b2f6ff6b20983db79179ed\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6\"" May 27 17:43:27.143330 containerd[1632]: time="2025-05-27T17:43:27.142773349Z" level=info msg="StartContainer for \"a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6\"" May 27 17:43:27.143825 containerd[1632]: time="2025-05-27T17:43:27.143805443Z" level=info msg="connecting to shim 
a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6" address="unix:///run/containerd/s/277aec4fc1d29364d79dbe691b2a2848e2a21449f5583410b8a34522d1bb6c89" protocol=ttrpc version=3 May 27 17:43:27.166258 systemd[1]: Started cri-containerd-a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6.scope - libcontainer container a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6. May 27 17:43:27.218493 containerd[1632]: time="2025-05-27T17:43:27.218432995Z" level=info msg="StartContainer for \"a5e694f5ad9494a47173ad651e745a16096ae43a006c4f356e52120c627473c6\" returns successfully" May 27 17:43:27.788610 containerd[1632]: time="2025-05-27T17:43:27.788381763Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:27.799901 containerd[1632]: time="2025-05-27T17:43:27.799872971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:43:27.834288 containerd[1632]: time="2025-05-27T17:43:27.834214860Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 860.284338ms" May 27 17:43:27.834288 containerd[1632]: time="2025-05-27T17:43:27.834244391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:43:27.834772 containerd[1632]: time="2025-05-27T17:43:27.834754085Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 17:43:27.863927 containerd[1632]: time="2025-05-27T17:43:27.863487996Z" level=info msg="CreateContainer within 
sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:43:27.967559 containerd[1632]: time="2025-05-27T17:43:27.967529352Z" level=info msg="Container 74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:27.968646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount380759081.mount: Deactivated successfully. May 27 17:43:28.018594 containerd[1632]: time="2025-05-27T17:43:28.018558560Z" level=info msg="CreateContainer within sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\"" May 27 17:43:28.019130 containerd[1632]: time="2025-05-27T17:43:28.018918636Z" level=info msg="StartContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\"" May 27 17:43:28.029932 containerd[1632]: time="2025-05-27T17:43:28.020189930Z" level=info msg="connecting to shim 74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca" address="unix:///run/containerd/s/5d36c9bf69fb7d8fca9dc604312f4450ba3ecace46ebbfeab8e8332ccfe555b0" protocol=ttrpc version=3 May 27 17:43:28.053152 systemd[1]: Started cri-containerd-74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca.scope - libcontainer container 74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca. 
May 27 17:43:28.117838 containerd[1632]: time="2025-05-27T17:43:28.117812075Z" level=info msg="StartContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" returns successfully" May 27 17:43:28.206058 kubelet[2918]: I0527 17:43:28.206012 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-657c668ffb-m5qhs" podStartSLOduration=30.101266696 podStartE2EDuration="35.205987139s" podCreationTimestamp="2025-05-27 17:42:53 +0000 UTC" firstStartedPulling="2025-05-27 17:43:22.730203636 +0000 UTC m=+43.891694773" lastFinishedPulling="2025-05-27 17:43:27.834924079 +0000 UTC m=+48.996415216" observedRunningTime="2025-05-27 17:43:28.175730143 +0000 UTC m=+49.337221289" watchObservedRunningTime="2025-05-27 17:43:28.205987139 +0000 UTC m=+49.367478278" May 27 17:43:29.005474 kubelet[2918]: I0527 17:43:29.004679 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7965f8d8cd-mnv87" podStartSLOduration=30.651331216 podStartE2EDuration="35.004659238s" podCreationTimestamp="2025-05-27 17:42:54 +0000 UTC" firstStartedPulling="2025-05-27 17:43:22.620310726 +0000 UTC m=+43.781801863" lastFinishedPulling="2025-05-27 17:43:26.973638745 +0000 UTC m=+48.135129885" observedRunningTime="2025-05-27 17:43:28.205635845 +0000 UTC m=+49.367126986" watchObservedRunningTime="2025-05-27 17:43:29.004659238 +0000 UTC m=+50.166150379" May 27 17:43:29.428033 kubelet[2918]: I0527 17:43:29.427793 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:29.428405 kubelet[2918]: I0527 17:43:29.428394 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:30.346254 containerd[1632]: time="2025-05-27T17:43:30.346102776Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:30.355699 containerd[1632]: 
time="2025-05-27T17:43:30.355671906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 27 17:43:30.361974 containerd[1632]: time="2025-05-27T17:43:30.361710750Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:30.366521 containerd[1632]: time="2025-05-27T17:43:30.366192181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:30.366678 containerd[1632]: time="2025-05-27T17:43:30.366609423Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.531839465s" May 27 17:43:30.366678 containerd[1632]: time="2025-05-27T17:43:30.366629156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 27 17:43:30.372133 containerd[1632]: time="2025-05-27T17:43:30.367267125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:43:30.383567 containerd[1632]: time="2025-05-27T17:43:30.383539617Z" level=info msg="CreateContainer within sandbox \"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 17:43:30.464117 containerd[1632]: time="2025-05-27T17:43:30.463396405Z" level=info msg="Container 6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999: CDI devices from CRI Config.CDIDevices: []" May 27 
17:43:30.499313 containerd[1632]: time="2025-05-27T17:43:30.499274995Z" level=info msg="CreateContainer within sandbox \"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999\"" May 27 17:43:30.499861 containerd[1632]: time="2025-05-27T17:43:30.499837238Z" level=info msg="StartContainer for \"6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999\"" May 27 17:43:30.506716 containerd[1632]: time="2025-05-27T17:43:30.501861777Z" level=info msg="connecting to shim 6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999" address="unix:///run/containerd/s/a523145612eb5edfe48abf0db3e4731a06eb0fcc26113a1e450cafff0f8aaeda" protocol=ttrpc version=3 May 27 17:43:30.542124 systemd[1]: Started cri-containerd-6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999.scope - libcontainer container 6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999. 
May 27 17:43:30.577128 containerd[1632]: time="2025-05-27T17:43:30.577091021Z" level=info msg="StartContainer for \"6d0ca08fd7087580346be4a41cc45cd266463c23471158c7548666ea1e7ba999\" returns successfully" May 27 17:43:31.118486 containerd[1632]: time="2025-05-27T17:43:31.118447916Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:31.119741 containerd[1632]: time="2025-05-27T17:43:31.119718636Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:43:31.124480 containerd[1632]: time="2025-05-27T17:43:31.124405231Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 757.122361ms" May 27 17:43:31.124480 containerd[1632]: time="2025-05-27T17:43:31.124430678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:43:31.125564 containerd[1632]: time="2025-05-27T17:43:31.125219725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:43:31.129723 containerd[1632]: time="2025-05-27T17:43:31.129693268Z" level=info msg="CreateContainer within sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:43:31.140069 containerd[1632]: time="2025-05-27T17:43:31.140033376Z" level=info msg="Container 5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:31.147546 containerd[1632]: 
time="2025-05-27T17:43:31.147485482Z" level=info msg="CreateContainer within sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\"" May 27 17:43:31.148005 containerd[1632]: time="2025-05-27T17:43:31.147930290Z" level=info msg="StartContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\"" May 27 17:43:31.149141 containerd[1632]: time="2025-05-27T17:43:31.149104758Z" level=info msg="connecting to shim 5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df" address="unix:///run/containerd/s/d1233ef0c517a062dc9a775cf5f08f854adff9ce7c445d030b8227919c2dc5e3" protocol=ttrpc version=3 May 27 17:43:31.174135 systemd[1]: Started cri-containerd-5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df.scope - libcontainer container 5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df. 
May 27 17:43:31.248723 containerd[1632]: time="2025-05-27T17:43:31.248697897Z" level=info msg="StartContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" returns successfully" May 27 17:43:31.386589 containerd[1632]: time="2025-05-27T17:43:31.386282000Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:31.387901 containerd[1632]: time="2025-05-27T17:43:31.387649141Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:31.388331 kubelet[2918]: E0527 17:43:31.388061 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:31.388331 kubelet[2918]: E0527 17:43:31.388092 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:31.389931 kubelet[2918]: I0527 17:43:31.388988 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-657c668ffb-nhkv6" podStartSLOduration=30.191295421 podStartE2EDuration="38.388880665s" podCreationTimestamp="2025-05-27 17:42:53 +0000 UTC" firstStartedPulling="2025-05-27 17:43:22.927572941 +0000 UTC m=+44.089064077" lastFinishedPulling="2025-05-27 17:43:31.125158178 +0000 UTC m=+52.286649321" observedRunningTime="2025-05-27 17:43:31.386757534 +0000 UTC m=+52.548248675" watchObservedRunningTime="2025-05-27 17:43:31.388880665 +0000 UTC m=+52.550371810" May 27 17:43:31.390017 containerd[1632]: time="2025-05-27T17:43:31.389022304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:43:31.390017 containerd[1632]: time="2025-05-27T17:43:31.389791632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 17:43:31.404166 kubelet[2918]: E0527 17:43:31.404097 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:e89948687a8847d995f11dcd4865307d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:32.374157 kubelet[2918]: I0527 17:43:32.374133 
2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:33.236974 containerd[1632]: time="2025-05-27T17:43:33.236925554Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:33.237643 containerd[1632]: time="2025-05-27T17:43:33.237518199Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 27 17:43:33.238542 containerd[1632]: time="2025-05-27T17:43:33.238497070Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:33.240783 containerd[1632]: time="2025-05-27T17:43:33.240725352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:43:33.241625 containerd[1632]: time="2025-05-27T17:43:33.241314723Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 1.851501498s" May 27 17:43:33.241625 containerd[1632]: time="2025-05-27T17:43:33.241349732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 27 17:43:33.243201 containerd[1632]: time="2025-05-27T17:43:33.243179713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 
17:43:33.244682 containerd[1632]: time="2025-05-27T17:43:33.243784767Z" level=info msg="CreateContainer within sandbox \"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 17:43:33.265339 containerd[1632]: time="2025-05-27T17:43:33.265296232Z" level=info msg="Container 907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:33.291705 containerd[1632]: time="2025-05-27T17:43:33.291674998Z" level=info msg="CreateContainer within sandbox \"b4afbc8a4e4e87913f4a8ce4472c442bec7de181d0ea8eb72d6ffe88f760fef6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6\"" May 27 17:43:33.292260 containerd[1632]: time="2025-05-27T17:43:33.292231144Z" level=info msg="StartContainer for \"907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6\"" May 27 17:43:33.293900 containerd[1632]: time="2025-05-27T17:43:33.293877297Z" level=info msg="connecting to shim 907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6" address="unix:///run/containerd/s/a523145612eb5edfe48abf0db3e4731a06eb0fcc26113a1e450cafff0f8aaeda" protocol=ttrpc version=3 May 27 17:43:33.318155 systemd[1]: Started cri-containerd-907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6.scope - libcontainer container 907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6. 
May 27 17:43:33.352023 containerd[1632]: time="2025-05-27T17:43:33.351971510Z" level=info msg="StartContainer for \"907116204bdb244aeeda613aee5ec6d473a68b6e905d5166ab3e286f505c34f6\" returns successfully" May 27 17:43:33.385140 kubelet[2918]: I0527 17:43:33.385009 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dhvg9" podStartSLOduration=26.990821893 podStartE2EDuration="37.384989438s" podCreationTimestamp="2025-05-27 17:42:56 +0000 UTC" firstStartedPulling="2025-05-27 17:43:22.848101685 +0000 UTC m=+44.009592824" lastFinishedPulling="2025-05-27 17:43:33.242269224 +0000 UTC m=+54.403760369" observedRunningTime="2025-05-27 17:43:33.384362899 +0000 UTC m=+54.545854036" watchObservedRunningTime="2025-05-27 17:43:33.384989438 +0000 UTC m=+54.546480579" May 27 17:43:33.497958 containerd[1632]: time="2025-05-27T17:43:33.497836143Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:33.498418 containerd[1632]: time="2025-05-27T17:43:33.498327026Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:33.498418 containerd[1632]: time="2025-05-27T17:43:33.498389798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:43:33.498627 kubelet[2918]: E0527 17:43:33.498585 2918 
log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:33.498697 kubelet[2918]: E0527 17:43:33.498634 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:33.499287 kubelet[2918]: E0527 17:43:33.498772 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:33.504666 kubelet[2918]: E0527 17:43:33.504608 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:43:34.356146 kubelet[2918]: I0527 17:43:34.353292 2918 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 17:43:34.362600 kubelet[2918]: I0527 17:43:34.362503 2918 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 17:43:36.828161 kubelet[2918]: I0527 17:43:36.827983 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 
17:43:36.922155 kubelet[2918]: I0527 17:43:36.922128 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:36.972390 containerd[1632]: time="2025-05-27T17:43:36.972353364Z" level=info msg="StopContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" with timeout 30 (s)" May 27 17:43:36.979590 containerd[1632]: time="2025-05-27T17:43:36.979539625Z" level=info msg="Stop container \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" with signal terminated" May 27 17:43:37.012933 systemd[1]: cri-containerd-74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca.scope: Deactivated successfully. May 27 17:43:37.057417 systemd[1]: Created slice kubepods-besteffort-poda7d4a404_ff16_4e99_a7f6_4e0753180905.slice - libcontainer container kubepods-besteffort-poda7d4a404_ff16_4e99_a7f6_4e0753180905.slice. May 27 17:43:37.059323 kubelet[2918]: I0527 17:43:37.058594 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a7d4a404-ff16-4e99-a7f6-4e0753180905-calico-apiserver-certs\") pod \"calico-apiserver-7965f8d8cd-s9pk4\" (UID: \"a7d4a404-ff16-4e99-a7f6-4e0753180905\") " pod="calico-apiserver/calico-apiserver-7965f8d8cd-s9pk4" May 27 17:43:37.059323 kubelet[2918]: I0527 17:43:37.058643 2918 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqt56\" (UniqueName: \"kubernetes.io/projected/a7d4a404-ff16-4e99-a7f6-4e0753180905-kube-api-access-kqt56\") pod \"calico-apiserver-7965f8d8cd-s9pk4\" (UID: \"a7d4a404-ff16-4e99-a7f6-4e0753180905\") " pod="calico-apiserver/calico-apiserver-7965f8d8cd-s9pk4" May 27 17:43:37.069431 containerd[1632]: time="2025-05-27T17:43:37.069401806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" 
id:\"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" pid:5151 exit_status:1 exited_at:{seconds:1748367817 nanos:62161052}" May 27 17:43:37.072472 containerd[1632]: time="2025-05-27T17:43:37.069909958Z" level=info msg="received exit event container_id:\"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" id:\"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" pid:5151 exit_status:1 exited_at:{seconds:1748367817 nanos:62161052}" May 27 17:43:37.092172 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca-rootfs.mount: Deactivated successfully. May 27 17:43:37.324758 containerd[1632]: time="2025-05-27T17:43:37.323140702Z" level=info msg="StopContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" returns successfully" May 27 17:43:37.331026 containerd[1632]: time="2025-05-27T17:43:37.330990103Z" level=info msg="StopPodSandbox for \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\"" May 27 17:43:37.336314 containerd[1632]: time="2025-05-27T17:43:37.336291634Z" level=info msg="Container to stop \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 27 17:43:37.340988 systemd[1]: cri-containerd-94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca.scope: Deactivated successfully. May 27 17:43:37.341198 systemd[1]: cri-containerd-94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca.scope: Consumed 33ms CPU time, 5.7M memory peak, 1.1M read from disk. 
May 27 17:43:37.355563 containerd[1632]: time="2025-05-27T17:43:37.355451591Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" id:\"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" pid:4800 exit_status:137 exited_at:{seconds:1748367817 nanos:355249784}" May 27 17:43:37.367740 containerd[1632]: time="2025-05-27T17:43:37.367039409Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-s9pk4,Uid:a7d4a404-ff16-4e99-a7f6-4e0753180905,Namespace:calico-apiserver,Attempt:0,}" May 27 17:43:37.380636 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca-rootfs.mount: Deactivated successfully. May 27 17:43:37.392468 containerd[1632]: time="2025-05-27T17:43:37.392254357Z" level=info msg="shim disconnected" id=94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca namespace=k8s.io May 27 17:43:37.392468 containerd[1632]: time="2025-05-27T17:43:37.392273738Z" level=warning msg="cleaning up after shim disconnected" id=94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca namespace=k8s.io May 27 17:43:37.400374 containerd[1632]: time="2025-05-27T17:43:37.392278581Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 27 17:43:37.534387 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca-shm.mount: Deactivated successfully. 
May 27 17:43:37.543133 containerd[1632]: time="2025-05-27T17:43:37.543094147Z" level=info msg="received exit event sandbox_id:\"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" exit_status:137 exited_at:{seconds:1748367817 nanos:355249784}" May 27 17:43:37.863598 systemd-networkd[1518]: cali2ab93dc9b9c: Link DOWN May 27 17:43:37.863608 systemd-networkd[1518]: cali2ab93dc9b9c: Lost carrier May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.858 [INFO][5384] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.859 [INFO][5384] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" iface="eth0" netns="/var/run/netns/cni-bcbd86c7-03c3-4edd-4d74-96990793ff1e" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.860 [INFO][5384] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" iface="eth0" netns="/var/run/netns/cni-bcbd86c7-03c3-4edd-4d74-96990793ff1e" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.868 [INFO][5384] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" after=7.922682ms iface="eth0" netns="/var/run/netns/cni-bcbd86c7-03c3-4edd-4d74-96990793ff1e" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.868 [INFO][5384] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:37.868 [INFO][5384] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.125 [INFO][5400] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.129 [INFO][5400] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.129 [INFO][5400] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.174 [INFO][5400] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.174 [INFO][5400] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.177 [INFO][5400] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:38.181920 containerd[1632]: 2025-05-27 17:43:38.178 [INFO][5384] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:38.185593 systemd[1]: run-netns-cni\x2dbcbd86c7\x2d03c3\x2d4edd\x2d4d74\x2d96990793ff1e.mount: Deactivated successfully. 
May 27 17:43:38.187474 containerd[1632]: time="2025-05-27T17:43:38.187452211Z" level=info msg="TearDown network for sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" successfully" May 27 17:43:38.188044 containerd[1632]: time="2025-05-27T17:43:38.187585191Z" level=info msg="StopPodSandbox for \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" returns successfully" May 27 17:43:38.245790 systemd-networkd[1518]: califb99b4d1202: Link UP May 27 17:43:38.246557 systemd-networkd[1518]: califb99b4d1202: Gained carrier May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:37.850 [INFO][5351] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0 calico-apiserver-7965f8d8cd- calico-apiserver a7d4a404-ff16-4e99-a7f6-4e0753180905 1085 0 2025-05-27 17:43:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7965f8d8cd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7965f8d8cd-s9pk4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califb99b4d1202 [] [] }} ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:37.853 [INFO][5351] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.125 
[INFO][5395] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" HandleID="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.129 [INFO][5395] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" HandleID="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003f83a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7965f8d8cd-s9pk4", "timestamp":"2025-05-27 17:43:38.125829672 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.129 [INFO][5395] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.177 [INFO][5395] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.178 [INFO][5395] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.190 [INFO][5395] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.198 [INFO][5395] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.203 [INFO][5395] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.205 [INFO][5395] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.212 [INFO][5395] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.212 [INFO][5395] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.213 [INFO][5395] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.216 [INFO][5395] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.225 [INFO][5395] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 
handle="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.225 [INFO][5395] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" host="localhost" May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.225 [INFO][5395] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:38.261842 containerd[1632]: 2025-05-27 17:43:38.225 [INFO][5395] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" HandleID="k8s-pod-network.0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Workload="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.228 [INFO][5351] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0", GenerateName:"calico-apiserver-7965f8d8cd-", Namespace:"calico-apiserver", SelfLink:"", UID:"a7d4a404-ff16-4e99-a7f6-4e0753180905", ResourceVersion:"1085", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 43, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7965f8d8cd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7965f8d8cd-s9pk4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califb99b4d1202", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.228 [INFO][5351] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.228 [INFO][5351] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califb99b4d1202 ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.250 [INFO][5351] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.250 [INFO][5351] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0", GenerateName:"calico-apiserver-7965f8d8cd-", Namespace:"calico-apiserver", SelfLink:"", UID:"a7d4a404-ff16-4e99-a7f6-4e0753180905", ResourceVersion:"1085", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 43, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7965f8d8cd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa", Pod:"calico-apiserver-7965f8d8cd-s9pk4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califb99b4d1202", MAC:"da:f0:46:c2:3f:a9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:43:38.266299 containerd[1632]: 2025-05-27 17:43:38.257 [INFO][5351] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" Namespace="calico-apiserver" Pod="calico-apiserver-7965f8d8cd-s9pk4" WorkloadEndpoint="localhost-k8s-calico--apiserver--7965f8d8cd--s9pk4-eth0" May 27 17:43:38.288802 kubelet[2918]: I0527 17:43:38.288780 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtz2z\" (UniqueName: \"kubernetes.io/projected/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-kube-api-access-qtz2z\") pod \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\" (UID: \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\") " May 27 17:43:38.289714 kubelet[2918]: I0527 17:43:38.289136 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-calico-apiserver-certs\") pod \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\" (UID: \"1db7bd7d-cd20-46eb-a3b9-8fab3073829e\") " May 27 17:43:38.327528 systemd[1]: var-lib-kubelet-pods-1db7bd7d\x2dcd20\x2d46eb\x2da3b9\x2d8fab3073829e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqtz2z.mount: Deactivated successfully. May 27 17:43:38.332355 containerd[1632]: time="2025-05-27T17:43:38.331453257Z" level=info msg="connecting to shim 0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa" address="unix:///run/containerd/s/c8c3d70859425b10049147054885b43f51ccb0f80a22427c9c38cd8c0e4fa529" namespace=k8s.io protocol=ttrpc version=3 May 27 17:43:38.341223 systemd[1]: var-lib-kubelet-pods-1db7bd7d\x2dcd20\x2d46eb\x2da3b9\x2d8fab3073829e-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
May 27 17:43:38.346712 kubelet[2918]: I0527 17:43:38.344125 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-kube-api-access-qtz2z" (OuterVolumeSpecName: "kube-api-access-qtz2z") pod "1db7bd7d-cd20-46eb-a3b9-8fab3073829e" (UID: "1db7bd7d-cd20-46eb-a3b9-8fab3073829e"). InnerVolumeSpecName "kube-api-access-qtz2z". PluginName "kubernetes.io/projected", VolumeGidValue "" May 27 17:43:38.346712 kubelet[2918]: I0527 17:43:38.344048 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "1db7bd7d-cd20-46eb-a3b9-8fab3073829e" (UID: "1db7bd7d-cd20-46eb-a3b9-8fab3073829e"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 27 17:43:38.377141 systemd[1]: Started cri-containerd-0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa.scope - libcontainer container 0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa. 
May 27 17:43:38.391009 kubelet[2918]: I0527 17:43:38.389721 2918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtz2z\" (UniqueName: \"kubernetes.io/projected/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-kube-api-access-qtz2z\") on node \"localhost\" DevicePath \"\"" May 27 17:43:38.391009 kubelet[2918]: I0527 17:43:38.389737 2918 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1db7bd7d-cd20-46eb-a3b9-8fab3073829e-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" May 27 17:43:38.398359 systemd-resolved[1519]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 27 17:43:38.429116 containerd[1632]: time="2025-05-27T17:43:38.425333668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7965f8d8cd-s9pk4,Uid:a7d4a404-ff16-4e99-a7f6-4e0753180905,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa\"" May 27 17:43:38.452841 systemd[1]: Removed slice kubepods-besteffort-pod1db7bd7d_cd20_46eb_a3b9_8fab3073829e.slice - libcontainer container kubepods-besteffort-pod1db7bd7d_cd20_46eb_a3b9_8fab3073829e.slice. May 27 17:43:38.452912 systemd[1]: kubepods-besteffort-pod1db7bd7d_cd20_46eb_a3b9_8fab3073829e.slice: Consumed 732ms CPU time, 46.5M memory peak, 1.1M read from disk. 
May 27 17:43:38.479433 containerd[1632]: time="2025-05-27T17:43:38.479384444Z" level=info msg="CreateContainer within sandbox \"0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:43:38.530710 kubelet[2918]: I0527 17:43:38.530685 2918 scope.go:117] "RemoveContainer" containerID="74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca" May 27 17:43:38.531984 containerd[1632]: time="2025-05-27T17:43:38.531964736Z" level=info msg="RemoveContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\"" May 27 17:43:38.534960 containerd[1632]: time="2025-05-27T17:43:38.534935744Z" level=info msg="Container 9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67: CDI devices from CRI Config.CDIDevices: []" May 27 17:43:38.567809 containerd[1632]: time="2025-05-27T17:43:38.567782456Z" level=info msg="CreateContainer within sandbox \"0656944cbe5812586db090642a83ef0d646746d8e00c7c63459af9c474c89dfa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67\"" May 27 17:43:38.573304 containerd[1632]: time="2025-05-27T17:43:38.568081789Z" level=info msg="StartContainer for \"9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67\"" May 27 17:43:38.573304 containerd[1632]: time="2025-05-27T17:43:38.568719322Z" level=info msg="connecting to shim 9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67" address="unix:///run/containerd/s/c8c3d70859425b10049147054885b43f51ccb0f80a22427c9c38cd8c0e4fa529" protocol=ttrpc version=3 May 27 17:43:38.580402 containerd[1632]: time="2025-05-27T17:43:38.580375044Z" level=info msg="RemoveContainer for \"74117781718fede22808b98de561ef9e125b96b2a540225a7f2cc631002e3bca\" returns successfully" May 27 17:43:38.591081 systemd[1]: Started 
cri-containerd-9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67.scope - libcontainer container 9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67. May 27 17:43:38.651775 containerd[1632]: time="2025-05-27T17:43:38.651744627Z" level=info msg="StartContainer for \"9fe9911760d0230b36f0097f5f7ad1d81b2a0f709e4d7a388c6a607539149f67\" returns successfully" May 27 17:43:39.071084 containerd[1632]: time="2025-05-27T17:43:39.069953235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:43:39.111383 containerd[1632]: time="2025-05-27T17:43:39.111170446Z" level=info msg="StopPodSandbox for \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\"" May 27 17:43:39.113459 kubelet[2918]: I0527 17:43:39.113303 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db7bd7d-cd20-46eb-a3b9-8fab3073829e" path="/var/lib/kubelet/pods/1db7bd7d-cd20-46eb-a3b9-8fab3073829e/volumes" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.161 [WARNING][5519] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.161 [INFO][5519] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.161 [INFO][5519] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" iface="eth0" netns="" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.161 [INFO][5519] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.161 [INFO][5519] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.187 [INFO][5527] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.187 [INFO][5527] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.188 [INFO][5527] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.192 [WARNING][5527] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.192 [INFO][5527] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.193 [INFO][5527] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:39.198107 containerd[1632]: 2025-05-27 17:43:39.195 [INFO][5519] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.198697 containerd[1632]: time="2025-05-27T17:43:39.198142514Z" level=info msg="TearDown network for sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" successfully" May 27 17:43:39.198697 containerd[1632]: time="2025-05-27T17:43:39.198174269Z" level=info msg="StopPodSandbox for \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" returns successfully" May 27 17:43:39.208684 containerd[1632]: time="2025-05-27T17:43:39.208481542Z" level=info msg="RemovePodSandbox for \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\"" May 27 17:43:39.208684 containerd[1632]: time="2025-05-27T17:43:39.208564731Z" level=info msg="Forcibly stopping sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\"" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.244 [WARNING][5541] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.244 [INFO][5541] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.244 [INFO][5541] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" iface="eth0" netns="" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.244 [INFO][5541] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.244 [INFO][5541] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.264 [INFO][5548] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.264 [INFO][5548] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.264 [INFO][5548] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.273 [WARNING][5548] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.273 [INFO][5548] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" HandleID="k8s-pod-network.94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" Workload="localhost-k8s-calico--apiserver--657c668ffb--m5qhs-eth0" May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.275 [INFO][5548] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:39.279295 containerd[1632]: 2025-05-27 17:43:39.277 [INFO][5541] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca" May 27 17:43:39.280438 containerd[1632]: time="2025-05-27T17:43:39.279918568Z" level=info msg="TearDown network for sandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" successfully" May 27 17:43:39.287360 containerd[1632]: time="2025-05-27T17:43:39.287326495Z" level=info msg="Ensure that sandbox 94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca in task-service has been cleanup successfully" May 27 17:43:39.292724 containerd[1632]: time="2025-05-27T17:43:39.292694042Z" level=info msg="RemovePodSandbox \"94eb2676e909bc8352205f1417d179c04566aba6e387b0427efedd5085aafaca\" returns successfully" May 27 17:43:39.339056 containerd[1632]: time="2025-05-27T17:43:39.338917973Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:39.342738 containerd[1632]: 
time="2025-05-27T17:43:39.342321579Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:39.343020 containerd[1632]: time="2025-05-27T17:43:39.342363501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:43:39.343188 kubelet[2918]: E0527 17:43:39.343163 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:43:39.346557 kubelet[2918]: E0527 17:43:39.346443 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:43:39.348749 kubelet[2918]: E0527 17:43:39.348071 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlq8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-vtwqz_calico-system(f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:39.349249 kubelet[2918]: E0527 17:43:39.349216 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:43:39.439806 kubelet[2918]: I0527 
17:43:39.439766 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7965f8d8cd-s9pk4" podStartSLOduration=3.439749562 podStartE2EDuration="3.439749562s" podCreationTimestamp="2025-05-27 17:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:43:39.439028259 +0000 UTC m=+60.600519399" watchObservedRunningTime="2025-05-27 17:43:39.439749562 +0000 UTC m=+60.601240703" May 27 17:43:39.576129 systemd-networkd[1518]: califb99b4d1202: Gained IPv6LL May 27 17:43:40.432558 kubelet[2918]: I0527 17:43:40.432533 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:41.396665 kubelet[2918]: I0527 17:43:41.396563 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:41.792265 kubelet[2918]: I0527 17:43:41.792197 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:43:42.063599 containerd[1632]: time="2025-05-27T17:43:42.063522795Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" id:\"273b01f14682bef7d7edcee2d9ef17041e792c787995746099c506289dc72a29\" pid:5575 exited_at:{seconds:1748367822 nanos:31311875}" May 27 17:43:42.463126 containerd[1632]: time="2025-05-27T17:43:42.462927001Z" level=info msg="StopContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" with timeout 30 (s)" May 27 17:43:42.463706 containerd[1632]: time="2025-05-27T17:43:42.463647715Z" level=info msg="Stop container \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" with signal terminated" May 27 17:43:42.485692 containerd[1632]: time="2025-05-27T17:43:42.484856893Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" 
id:\"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" pid:5228 exit_status:1 exited_at:{seconds:1748367822 nanos:484288178}" May 27 17:43:42.485692 containerd[1632]: time="2025-05-27T17:43:42.485117631Z" level=info msg="received exit event container_id:\"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" id:\"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" pid:5228 exit_status:1 exited_at:{seconds:1748367822 nanos:484288178}" May 27 17:43:42.487715 systemd[1]: cri-containerd-5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df.scope: Deactivated successfully. May 27 17:43:42.517369 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df-rootfs.mount: Deactivated successfully. May 27 17:43:42.540510 containerd[1632]: time="2025-05-27T17:43:42.540482917Z" level=info msg="StopContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" returns successfully" May 27 17:43:42.541419 containerd[1632]: time="2025-05-27T17:43:42.540762995Z" level=info msg="StopPodSandbox for \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\"" May 27 17:43:42.543464 containerd[1632]: time="2025-05-27T17:43:42.543328221Z" level=info msg="Container to stop \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 27 17:43:42.547608 systemd[1]: cri-containerd-85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd.scope: Deactivated successfully. 
May 27 17:43:42.551341 containerd[1632]: time="2025-05-27T17:43:42.551321662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" id:\"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" pid:5005 exit_status:137 exited_at:{seconds:1748367822 nanos:551129170}" May 27 17:43:42.570793 containerd[1632]: time="2025-05-27T17:43:42.570729462Z" level=info msg="shim disconnected" id=85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd namespace=k8s.io May 27 17:43:42.570793 containerd[1632]: time="2025-05-27T17:43:42.570751422Z" level=warning msg="cleaning up after shim disconnected" id=85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd namespace=k8s.io May 27 17:43:42.571068 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd-rootfs.mount: Deactivated successfully. May 27 17:43:42.578213 containerd[1632]: time="2025-05-27T17:43:42.570758386Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 27 17:43:42.596380 containerd[1632]: time="2025-05-27T17:43:42.594592107Z" level=info msg="received exit event sandbox_id:\"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" exit_status:137 exited_at:{seconds:1748367822 nanos:551129170}" May 27 17:43:42.596299 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd-shm.mount: Deactivated successfully. 
May 27 17:43:42.632134 systemd-networkd[1518]: cali5eef2231ecc: Link DOWN May 27 17:43:42.632485 systemd-networkd[1518]: cali5eef2231ecc: Lost carrier May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.629 [INFO][5657] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.631 [INFO][5657] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" iface="eth0" netns="/var/run/netns/cni-9a3a6fc9-1f83-1f35-8c5d-60fc050ca4e7" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.631 [INFO][5657] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" iface="eth0" netns="/var/run/netns/cni-9a3a6fc9-1f83-1f35-8c5d-60fc050ca4e7" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.637 [INFO][5657] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" after=6.523686ms iface="eth0" netns="/var/run/netns/cni-9a3a6fc9-1f83-1f35-8c5d-60fc050ca4e7" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.637 [INFO][5657] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.637 [INFO][5657] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.658 [INFO][5669] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.658 [INFO][5669] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.658 [INFO][5669] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.678 [INFO][5669] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.678 [INFO][5669] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0" May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.679 [INFO][5669] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:43:42.682962 containerd[1632]: 2025-05-27 17:43:42.681 [INFO][5657] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" May 27 17:43:42.685789 containerd[1632]: time="2025-05-27T17:43:42.683633888Z" level=info msg="TearDown network for sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" successfully" May 27 17:43:42.685789 containerd[1632]: time="2025-05-27T17:43:42.683659201Z" level=info msg="StopPodSandbox for \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" returns successfully" May 27 17:43:42.685636 systemd[1]: run-netns-cni\x2d9a3a6fc9\x2d1f83\x2d1f35\x2d8c5d\x2d60fc050ca4e7.mount: Deactivated successfully. 
May 27 17:43:42.822620 kubelet[2918]: I0527 17:43:42.822441 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8fkg\" (UniqueName: \"kubernetes.io/projected/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-kube-api-access-z8fkg\") pod \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\" (UID: \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\") " May 27 17:43:42.822620 kubelet[2918]: I0527 17:43:42.822504 2918 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-calico-apiserver-certs\") pod \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\" (UID: \"82fd1cbc-6be0-4a3a-8294-e11be1d04cd9\") " May 27 17:43:42.836678 systemd[1]: var-lib-kubelet-pods-82fd1cbc\x2d6be0\x2d4a3a\x2d8294\x2de11be1d04cd9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dz8fkg.mount: Deactivated successfully. May 27 17:43:42.836900 systemd[1]: var-lib-kubelet-pods-82fd1cbc\x2d6be0\x2d4a3a\x2d8294\x2de11be1d04cd9-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 27 17:43:42.844776 kubelet[2918]: I0527 17:43:42.844726 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-kube-api-access-z8fkg" (OuterVolumeSpecName: "kube-api-access-z8fkg") pod "82fd1cbc-6be0-4a3a-8294-e11be1d04cd9" (UID: "82fd1cbc-6be0-4a3a-8294-e11be1d04cd9"). InnerVolumeSpecName "kube-api-access-z8fkg". PluginName "kubernetes.io/projected", VolumeGidValue "" May 27 17:43:42.845976 kubelet[2918]: I0527 17:43:42.845950 2918 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "82fd1cbc-6be0-4a3a-8294-e11be1d04cd9" (UID: "82fd1cbc-6be0-4a3a-8294-e11be1d04cd9"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" May 27 17:43:42.923591 kubelet[2918]: I0527 17:43:42.923556 2918 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" May 27 17:43:42.923591 kubelet[2918]: I0527 17:43:42.923583 2918 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8fkg\" (UniqueName: \"kubernetes.io/projected/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9-kube-api-access-z8fkg\") on node \"localhost\" DevicePath \"\"" May 27 17:43:43.025703 systemd[1]: Removed slice kubepods-besteffort-pod82fd1cbc_6be0_4a3a_8294_e11be1d04cd9.slice - libcontainer container kubepods-besteffort-pod82fd1cbc_6be0_4a3a_8294_e11be1d04cd9.slice. May 27 17:43:43.437665 kubelet[2918]: I0527 17:43:43.437569 2918 scope.go:117] "RemoveContainer" containerID="5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df" May 27 17:43:43.442075 containerd[1632]: time="2025-05-27T17:43:43.442052988Z" level=info msg="RemoveContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\"" May 27 17:43:43.445107 containerd[1632]: time="2025-05-27T17:43:43.445079460Z" level=info msg="RemoveContainer for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" returns successfully" May 27 17:43:43.445875 kubelet[2918]: I0527 17:43:43.445266 2918 scope.go:117] "RemoveContainer" containerID="5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df" May 27 17:43:43.445956 containerd[1632]: time="2025-05-27T17:43:43.445676771Z" level=error msg="ContainerStatus for \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\": not found" May 27 17:43:43.453355 kubelet[2918]: E0527 17:43:43.453329 2918 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\": not found" containerID="5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df" May 27 17:43:43.458571 kubelet[2918]: I0527 17:43:43.453359 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df"} err="failed to get container status \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\": rpc error: code = NotFound desc = an error occurred when try to find container \"5ac079ce6346230785ac3a2e955b3c4dc5ff841c798b839a01fa35d91058d6df\": not found" May 27 17:43:44.983717 kubelet[2918]: I0527 17:43:44.983553 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82fd1cbc-6be0-4a3a-8294-e11be1d04cd9" path="/var/lib/kubelet/pods/82fd1cbc-6be0-4a3a-8294-e11be1d04cd9/volumes" May 27 17:43:45.984162 kubelet[2918]: E0527 17:43:45.984080 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:43:48.081741 containerd[1632]: time="2025-05-27T17:43:48.081682747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" id:\"6643bae5eb58f1a51ebf861bb0d91fd11c5d0995c9c9ee7bdf394c8ecfa28388\" pid:5707 exited_at:{seconds:1748367828 nanos:81471322}" May 27 17:43:54.008699 kubelet[2918]: E0527 17:43:54.008475 2918 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:43:59.042451 containerd[1632]: time="2025-05-27T17:43:59.042267036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:43:59.347553 containerd[1632]: time="2025-05-27T17:43:59.346606599Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:59.354114 containerd[1632]: time="2025-05-27T17:43:59.354020016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:59.355157 containerd[1632]: time="2025-05-27T17:43:59.355115977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:43:59.383755 kubelet[2918]: E0527 17:43:59.354286 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:59.414951 kubelet[2918]: E0527 17:43:59.414908 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:43:59.456800 kubelet[2918]: E0527 17:43:59.456758 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:e89948687a8847d995f11dcd4865307d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource
{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:59.465967 containerd[1632]: time="2025-05-27T17:43:59.459135183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:43:59.727450 containerd[1632]: time="2025-05-27T17:43:59.727159728Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:43:59.735510 containerd[1632]: time="2025-05-27T17:43:59.735482825Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:43:59.735785 containerd[1632]: time="2025-05-27T17:43:59.735539528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:43:59.735814 kubelet[2918]: E0527 17:43:59.735631 2918 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:59.735814 kubelet[2918]: E0527 17:43:59.735664 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:43:59.735814 kubelet[2918]: E0527 17:43:59.735735 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:43:59.737076 kubelet[2918]: E0527 17:43:59.737033 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:44:03.124038 systemd[1]: Started sshd@8-139.178.70.105:22-139.178.89.65:49374.service - OpenSSH per-connection server daemon (139.178.89.65:49374). May 27 17:44:03.464855 sshd[5733]: Accepted publickey for core from 139.178.89.65 port 49374 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:03.469080 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:03.475817 systemd-logind[1607]: New session 10 of user core. May 27 17:44:03.483233 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 27 17:44:06.067887 sshd[5735]: Connection closed by 139.178.89.65 port 49374 May 27 17:44:06.068288 sshd-session[5733]: pam_unix(sshd:session): session closed for user core May 27 17:44:06.074410 systemd-logind[1607]: Session 10 logged out. Waiting for processes to exit. May 27 17:44:06.074724 systemd[1]: sshd@8-139.178.70.105:22-139.178.89.65:49374.service: Deactivated successfully. May 27 17:44:06.076890 systemd[1]: session-10.scope: Deactivated successfully. May 27 17:44:06.078120 systemd-logind[1607]: Removed session 10. May 27 17:44:08.548906 containerd[1632]: time="2025-05-27T17:44:08.548871111Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" id:\"539e80619f3047b8f1b68d0e66ee0e8ecac507d43b7f9bdcc9652efa6d8cafb2\" pid:5761 exited_at:{seconds:1748367848 nanos:541795788}" May 27 17:44:09.073777 containerd[1632]: time="2025-05-27T17:44:09.073751972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:44:09.299288 containerd[1632]: time="2025-05-27T17:44:09.299180169Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:44:09.299521 containerd[1632]: time="2025-05-27T17:44:09.299500540Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:44:09.299621 containerd[1632]: time="2025-05-27T17:44:09.299559992Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:44:09.299670 kubelet[2918]: E0527 17:44:09.299639 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:44:09.300572 kubelet[2918]: E0527 17:44:09.299678 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:44:09.300572 kubelet[2918]: E0527 17:44:09.299757 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlq8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-vtwqz_calico-system(f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:44:09.301073 kubelet[2918]: E0527 17:44:09.301050 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:44:10.986260 kubelet[2918]: E0527 
17:44:10.986116 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:44:11.079810 systemd[1]: Started sshd@9-139.178.70.105:22-139.178.89.65:49380.service - OpenSSH per-connection server daemon (139.178.89.65:49380). May 27 17:44:11.242671 sshd[5773]: Accepted publickey for core from 139.178.89.65 port 49380 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:11.243713 sshd-session[5773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:11.246885 systemd-logind[1607]: New session 11 of user core. May 27 17:44:11.251119 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 17:44:11.722626 sshd[5775]: Connection closed by 139.178.89.65 port 49380 May 27 17:44:11.722834 sshd-session[5773]: pam_unix(sshd:session): session closed for user core May 27 17:44:11.733401 systemd[1]: sshd@9-139.178.70.105:22-139.178.89.65:49380.service: Deactivated successfully. May 27 17:44:11.734889 systemd[1]: session-11.scope: Deactivated successfully. May 27 17:44:11.735808 systemd-logind[1607]: Session 11 logged out. Waiting for processes to exit. May 27 17:44:11.737306 systemd-logind[1607]: Removed session 11. 
May 27 17:44:12.926051 containerd[1632]: time="2025-05-27T17:44:12.925948747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" id:\"2fda1653f9e38fca6605cd5120d50ae74896b3d855ebc34058bdb4d8809c8daa\" pid:5796 exited_at:{seconds:1748367852 nanos:925699864}" May 27 17:44:16.734405 systemd[1]: Started sshd@10-139.178.70.105:22-139.178.89.65:43882.service - OpenSSH per-connection server daemon (139.178.89.65:43882). May 27 17:44:17.648140 sshd[5814]: Accepted publickey for core from 139.178.89.65 port 43882 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:17.671564 sshd-session[5814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:17.691693 systemd-logind[1607]: New session 12 of user core. May 27 17:44:17.699141 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 17:44:17.960406 containerd[1632]: time="2025-05-27T17:44:17.960340878Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" id:\"e8dd7f03ad2e02ca359dd75583a61c5ecbc0583a82c522e389af6e4e09a5402f\" pid:5835 exited_at:{seconds:1748367857 nanos:960097477}" May 27 17:44:18.610495 sshd[5816]: Connection closed by 139.178.89.65 port 43882 May 27 17:44:18.611887 sshd-session[5814]: pam_unix(sshd:session): session closed for user core May 27 17:44:18.621446 systemd[1]: sshd@10-139.178.70.105:22-139.178.89.65:43882.service: Deactivated successfully. May 27 17:44:18.623364 systemd[1]: session-12.scope: Deactivated successfully. May 27 17:44:18.624610 systemd-logind[1607]: Session 12 logged out. Waiting for processes to exit. May 27 17:44:18.626570 systemd[1]: Started sshd@11-139.178.70.105:22-139.178.89.65:43884.service - OpenSSH per-connection server daemon (139.178.89.65:43884). May 27 17:44:18.628722 systemd-logind[1607]: Removed session 12. 
May 27 17:44:18.671178 sshd[5849]: Accepted publickey for core from 139.178.89.65 port 43884 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:18.672824 sshd-session[5849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:18.677153 systemd-logind[1607]: New session 13 of user core. May 27 17:44:18.682142 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 17:44:18.941561 sshd[5851]: Connection closed by 139.178.89.65 port 43884 May 27 17:44:18.943220 sshd-session[5849]: pam_unix(sshd:session): session closed for user core May 27 17:44:18.950990 systemd[1]: sshd@11-139.178.70.105:22-139.178.89.65:43884.service: Deactivated successfully. May 27 17:44:18.952765 systemd[1]: session-13.scope: Deactivated successfully. May 27 17:44:18.953367 systemd-logind[1607]: Session 13 logged out. Waiting for processes to exit. May 27 17:44:18.959206 systemd[1]: Started sshd@12-139.178.70.105:22-139.178.89.65:43892.service - OpenSSH per-connection server daemon (139.178.89.65:43892). May 27 17:44:18.965234 systemd-logind[1607]: Removed session 13. May 27 17:44:19.017169 sshd[5865]: Accepted publickey for core from 139.178.89.65 port 43892 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:19.019578 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:19.027559 systemd-logind[1607]: New session 14 of user core. May 27 17:44:19.031358 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 17:44:19.247150 sshd[5867]: Connection closed by 139.178.89.65 port 43892 May 27 17:44:19.250307 systemd-logind[1607]: Session 14 logged out. Waiting for processes to exit. May 27 17:44:19.248035 sshd-session[5865]: pam_unix(sshd:session): session closed for user core May 27 17:44:19.250388 systemd[1]: sshd@12-139.178.70.105:22-139.178.89.65:43892.service: Deactivated successfully. 
May 27 17:44:19.251517 systemd[1]: session-14.scope: Deactivated successfully. May 27 17:44:19.252232 systemd-logind[1607]: Removed session 14. May 27 17:44:21.046448 kubelet[2918]: E0527 17:44:21.046386 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:44:21.994835 kubelet[2918]: E0527 17:44:21.994788 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:44:24.262522 systemd[1]: Started sshd@13-139.178.70.105:22-139.178.89.65:43780.service - OpenSSH per-connection server daemon (139.178.89.65:43780). May 27 17:44:24.378322 sshd[5883]: Accepted publickey for core from 139.178.89.65 port 43780 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:24.379910 sshd-session[5883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:24.385121 systemd-logind[1607]: New session 15 of user core. May 27 17:44:24.394239 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 17:44:24.689836 sshd[5885]: Connection closed by 139.178.89.65 port 43780 May 27 17:44:24.689758 sshd-session[5883]: pam_unix(sshd:session): session closed for user core May 27 17:44:24.699807 systemd[1]: sshd@13-139.178.70.105:22-139.178.89.65:43780.service: Deactivated successfully. May 27 17:44:24.704684 systemd[1]: session-15.scope: Deactivated successfully. 
May 27 17:44:24.706197 systemd-logind[1607]: Session 15 logged out. Waiting for processes to exit. May 27 17:44:24.711312 systemd[1]: Started sshd@14-139.178.70.105:22-139.178.89.65:43792.service - OpenSSH per-connection server daemon (139.178.89.65:43792). May 27 17:44:24.713109 systemd-logind[1607]: Removed session 15. May 27 17:44:24.784022 sshd[5897]: Accepted publickey for core from 139.178.89.65 port 43792 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:24.785688 sshd-session[5897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:24.797690 systemd-logind[1607]: New session 16 of user core. May 27 17:44:24.803320 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 17:44:25.229919 sshd[5899]: Connection closed by 139.178.89.65 port 43792 May 27 17:44:25.229500 sshd-session[5897]: pam_unix(sshd:session): session closed for user core May 27 17:44:25.239485 systemd[1]: Started sshd@15-139.178.70.105:22-139.178.89.65:43802.service - OpenSSH per-connection server daemon (139.178.89.65:43802). May 27 17:44:25.243388 systemd[1]: sshd@14-139.178.70.105:22-139.178.89.65:43792.service: Deactivated successfully. May 27 17:44:25.248365 systemd[1]: session-16.scope: Deactivated successfully. May 27 17:44:25.249832 systemd-logind[1607]: Session 16 logged out. Waiting for processes to exit. May 27 17:44:25.252341 systemd-logind[1607]: Removed session 16. May 27 17:44:25.396724 sshd[5906]: Accepted publickey for core from 139.178.89.65 port 43802 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:25.398391 sshd-session[5906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:25.403308 systemd-logind[1607]: New session 17 of user core. May 27 17:44:25.412237 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 27 17:44:28.180452 sshd[5911]: Connection closed by 139.178.89.65 port 43802 May 27 17:44:28.225963 sshd-session[5906]: pam_unix(sshd:session): session closed for user core May 27 17:44:28.245569 systemd[1]: Started sshd@16-139.178.70.105:22-139.178.89.65:43816.service - OpenSSH per-connection server daemon (139.178.89.65:43816). May 27 17:44:28.260603 systemd[1]: sshd@15-139.178.70.105:22-139.178.89.65:43802.service: Deactivated successfully. May 27 17:44:28.263697 systemd[1]: session-17.scope: Deactivated successfully. May 27 17:44:28.264695 systemd[1]: session-17.scope: Consumed 491ms CPU time, 68.8M memory peak. May 27 17:44:28.269322 systemd-logind[1607]: Session 17 logged out. Waiting for processes to exit. May 27 17:44:28.272243 systemd-logind[1607]: Removed session 17. May 27 17:44:28.416016 sshd[5923]: Accepted publickey for core from 139.178.89.65 port 43816 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:28.419123 sshd-session[5923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:28.428169 systemd-logind[1607]: New session 18 of user core. May 27 17:44:28.432133 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 17:44:31.308258 sshd[5930]: Connection closed by 139.178.89.65 port 43816 May 27 17:44:31.348550 sshd-session[5923]: pam_unix(sshd:session): session closed for user core May 27 17:44:31.370537 systemd[1]: Started sshd@17-139.178.70.105:22-139.178.89.65:43828.service - OpenSSH per-connection server daemon (139.178.89.65:43828). May 27 17:44:31.373895 systemd[1]: sshd@16-139.178.70.105:22-139.178.89.65:43816.service: Deactivated successfully. May 27 17:44:31.375621 systemd[1]: session-18.scope: Deactivated successfully. May 27 17:44:31.379125 systemd-logind[1607]: Session 18 logged out. Waiting for processes to exit. May 27 17:44:31.383553 systemd-logind[1607]: Removed session 18. 
May 27 17:44:31.741522 sshd[5939]: Accepted publickey for core from 139.178.89.65 port 43828 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc May 27 17:44:31.746275 sshd-session[5939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:44:31.766386 systemd-logind[1607]: New session 19 of user core. May 27 17:44:31.773151 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 17:44:32.360539 sshd[5944]: Connection closed by 139.178.89.65 port 43828 May 27 17:44:32.364558 systemd[1]: sshd@17-139.178.70.105:22-139.178.89.65:43828.service: Deactivated successfully. May 27 17:44:32.360893 sshd-session[5939]: pam_unix(sshd:session): session closed for user core May 27 17:44:32.367106 systemd[1]: session-19.scope: Deactivated successfully. May 27 17:44:32.370514 systemd-logind[1607]: Session 19 logged out. Waiting for processes to exit. May 27 17:44:32.373646 systemd-logind[1607]: Removed session 19. May 27 17:44:33.124452 kubelet[2918]: E0527 17:44:33.123320 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d" May 27 17:44:33.982549 kubelet[2918]: E0527 17:44:33.982511 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7" May 27 17:44:37.373715 systemd[1]: Started sshd@18-139.178.70.105:22-139.178.89.65:57546.service - OpenSSH per-connection server daemon 
(139.178.89.65:57546).
May 27 17:44:37.681757 sshd[5965]: Accepted publickey for core from 139.178.89.65 port 57546 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:44:37.683305 sshd-session[5965]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:44:37.694783 systemd-logind[1607]: New session 20 of user core.
May 27 17:44:37.700625 systemd[1]: Started session-20.scope - Session 20 of User core.
May 27 17:44:38.493078 sshd[5967]: Connection closed by 139.178.89.65 port 57546
May 27 17:44:38.493313 sshd-session[5965]: pam_unix(sshd:session): session closed for user core
May 27 17:44:38.496949 systemd[1]: sshd@18-139.178.70.105:22-139.178.89.65:57546.service: Deactivated successfully.
May 27 17:44:38.498470 systemd[1]: session-20.scope: Deactivated successfully.
May 27 17:44:38.499249 systemd-logind[1607]: Session 20 logged out. Waiting for processes to exit.
May 27 17:44:38.500389 systemd-logind[1607]: Removed session 20.
May 27 17:44:39.388148 containerd[1632]: time="2025-05-27T17:44:39.380342277Z" level=info msg="StopPodSandbox for \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\""
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:40.425 [WARNING][5988] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:40.431 [INFO][5988] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:40.431 [INFO][5988] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" iface="eth0" netns=""
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:40.431 [INFO][5988] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:40.431 [INFO][5988] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.270 [INFO][6001] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.300 [INFO][6001] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.306 [INFO][6001] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.391 [WARNING][6001] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.391 [INFO][6001] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.392 [INFO][6001] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 27 17:44:41.395663 containerd[1632]: 2025-05-27 17:44:41.393 [INFO][5988] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:41.435685 containerd[1632]: time="2025-05-27T17:44:41.399526661Z" level=info msg="TearDown network for sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" successfully"
May 27 17:44:41.435685 containerd[1632]: time="2025-05-27T17:44:41.399557619Z" level=info msg="StopPodSandbox for \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" returns successfully"
May 27 17:44:41.505985 containerd[1632]: time="2025-05-27T17:44:41.505957248Z" level=info msg="RemovePodSandbox for \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\""
May 27 17:44:41.505985 containerd[1632]: time="2025-05-27T17:44:41.505987312Z" level=info msg="Forcibly stopping sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\""
May 27 17:44:43.744027 systemd[1]: Started sshd@19-139.178.70.105:22-139.178.89.65:54408.service - OpenSSH per-connection server daemon (139.178.89.65:54408).
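Editor's note: the Calico IPAM entries above show a deliberately idempotent teardown: "Asked to release address but it doesn't exist. Ignoring" is a WARNING, not a failure, and the DEL still completes under the host-wide lock. A minimal sketch of that pattern (illustrative names only, not Calico's actual API):

```python
# Sketch of the idempotent-release pattern seen in the IPAM log lines above:
# releasing a handle that no longer exists warns and succeeds, so repeated
# teardowns of the same sandbox are safe. All names here are hypothetical.
import threading

class IpamStore:
    def __init__(self):
        self._lock = threading.Lock()  # stands in for the host-wide IPAM lock
        self._allocations = {}         # handle_id -> address

    def release(self, handle_id):
        """Release an allocation; a missing handle is logged and ignored."""
        with self._lock:
            addr = self._allocations.pop(handle_id, None)
            if addr is None:
                print(f"WARNING: asked to release {handle_id} but it doesn't exist. Ignoring")
            return addr

store = IpamStore()
store._allocations["k8s-pod-network.example"] = "192.168.0.5"
store.release("k8s-pod-network.example")  # returns the address
store.release("k8s-pod-network.example")  # second DEL: warns, still succeeds
```

This is why the second "Forcibly stopping sandbox" pass later in the log can replay the same release without erroring out.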
May 27 17:44:44.778970 sshd[6051]: Accepted publickey for core from 139.178.89.65 port 54408 ssh2: RSA SHA256:oeXmtaCEort+f/+5eVsmlYpuBsOFaMex1/8ZdPv/dCc
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:42.926 [WARNING][6016] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" WorkloadEndpoint="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:42.975 [INFO][6016] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:42.975 [INFO][6016] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" iface="eth0" netns=""
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:42.975 [INFO][6016] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:42.975 [INFO][6016] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.702 [INFO][6046] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.719 [INFO][6046] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.729 [INFO][6046] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.798 [WARNING][6046] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.798 [INFO][6046] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" HandleID="k8s-pod-network.85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd" Workload="localhost-k8s-calico--apiserver--657c668ffb--nhkv6-eth0"
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.800 [INFO][6046] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 27 17:44:44.848247 containerd[1632]: 2025-05-27 17:44:44.802 [INFO][6016] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd"
May 27 17:44:44.836676 sshd-session[6051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:44:44.904234 containerd[1632]: time="2025-05-27T17:44:44.863244201Z" level=info msg="TearDown network for sandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" successfully"
May 27 17:44:44.916473 systemd-logind[1607]: New session 21 of user core.
May 27 17:44:44.921105 systemd[1]: Started session-21.scope - Session 21 of User core.
May 27 17:44:44.981016 containerd[1632]: time="2025-05-27T17:44:44.980951113Z" level=info msg="Ensure that sandbox 85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd in task-service has been cleanup successfully"
May 27 17:44:45.274152 containerd[1632]: time="2025-05-27T17:44:45.274120277Z" level=info msg="RemovePodSandbox \"85aa82ab13e40c1ebade2e52268d2b6b1fea819290162341a4912e7132c313dd\" returns successfully"
May 27 17:44:45.663409 containerd[1632]: time="2025-05-27T17:44:45.663331449Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e2d52b8c4fe1936ce0b5e3314842f0e397e2091d6f77187224b31cfb8bc9c24\" id:\"303020489a1f3ada46ec1088a221cf323d2921102b941ace0fad953a158c6a3b\" pid:6032 exited_at:{seconds:1748367885 nanos:484459952}"
May 27 17:44:47.836506 sshd[6056]: Connection closed by 139.178.89.65 port 54408
May 27 17:44:47.838317 sshd-session[6051]: pam_unix(sshd:session): session closed for user core
May 27 17:44:47.878025 systemd[1]: sshd@19-139.178.70.105:22-139.178.89.65:54408.service: Deactivated successfully.
May 27 17:44:47.893563 systemd[1]: session-21.scope: Deactivated successfully.
May 27 17:44:47.895321 systemd-logind[1607]: Session 21 logged out. Waiting for processes to exit.
May 27 17:44:47.898053 systemd-logind[1607]: Removed session 21.
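Editor's note: the kubelet "ImagePullBackOff" entries in this log (e.g. for ghcr.io/flatcar/calico/goldmane:v3.30.0) reflect kubelet's image-pull back-off. As a sketch under the commonly documented defaults (10s initial delay, doubling per failure, capped at 300s; assumptions, not read from this log):

```python
# Hypothetical sketch of kubelet's image-pull back-off schedule, assuming the
# usual defaults: 10s initial delay, doubling on each failed pull, capped at
# 300s. Not an exact reproduction of kubelet's internals.
def backoff_schedule(failures, initial=10, cap=300):
    """Return the delay in seconds before each of `failures` retries."""
    return [min(initial * 2 ** i, cap) for i in range(failures)]

print(backoff_schedule(7))
```

This explains why a persistently failing pull (like the 403s below on the whisker images) settles into a retry roughly every five minutes rather than hammering the registry.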
May 27 17:44:48.420237 containerd[1632]: time="2025-05-27T17:44:48.420142800Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff64fcf68a494d596be3762109c08f9ab72de046ada27e7111574c8827e3eb1d\" id:\"7a3da368f18dca6e3d22da96ccf0b0adbe92925987ca01c636eb3b5f2ddd14dc\" pid:6106 exited_at:{seconds:1748367888 nanos:419567511}"
May 27 17:44:48.690195 kubelet[2918]: E0527 17:44:48.689766 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-vtwqz" podUID="f2e9e0e3-b52a-4cbc-87d0-04cf8f359b7d"
May 27 17:44:49.296010 containerd[1632]: time="2025-05-27T17:44:49.295367714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 27 17:44:49.712793 containerd[1632]: time="2025-05-27T17:44:49.712703095Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 17:44:49.723530 containerd[1632]: time="2025-05-27T17:44:49.723461603Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 17:44:49.723609 containerd[1632]: time="2025-05-27T17:44:49.723554888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 17:44:49.741435 kubelet[2918]: E0527 17:44:49.741330 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 17:44:49.761026 kubelet[2918]: E0527 17:44:49.751239 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 17:44:49.881429 kubelet[2918]: E0527 17:44:49.881377 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:e89948687a8847d995f11dcd4865307d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 17:44:49.883222 containerd[1632]: time="2025-05-27T17:44:49.883133698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 17:44:50.153487 containerd[1632]: time="2025-05-27T17:44:50.153336972Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 17:44:50.156804 containerd[1632]: time="2025-05-27T17:44:50.156688219Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 17:44:50.156804 containerd[1632]: time="2025-05-27T17:44:50.156772259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 17:44:50.156936 kubelet[2918]: E0527 17:44:50.156874 2918 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 17:44:50.156936 kubelet[2918]: E0527 17:44:50.156910 2918 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 17:44:50.157082 kubelet[2918]: E0527 17:44:50.156989 2918 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-84f848d4b6-xtg49_calico-system(2b64553a-fefe-49c7-909d-c50ce828eef7): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 17:44:50.158465 kubelet[2918]: E0527 17:44:50.158431 2918 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-84f848d4b6-xtg49" podUID="2b64553a-fefe-49c7-909d-c50ce828eef7"
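Editor's note: every pull failure above dies at the registry's anonymous token endpoint, which returns 403 Forbidden before any image data is fetched. The token URL in the log follows the standard registry token-auth shape (service + pull scope). A sketch of how such a URL is assembled from an image reference (the helper name is mine; the URL format is as logged):

```python
# Sketch: derive the anonymous token URL a registry client would hit for a
# pull, matching the URLs in the containerd log above. Assumes a simple
# "host/repo:tag" reference; `anonymous_token_url` is a hypothetical helper.
from urllib.parse import urlencode

def anonymous_token_url(image_ref):
    host, _, rest = image_ref.partition("/")
    repository = rest.rsplit(":", 1)[0]  # drop the ":tag" suffix
    query = urlencode({"scope": f"repository:{repository}:pull", "service": host})
    return f"https://{host}/token?{query}"

print(anonymous_token_url("ghcr.io/flatcar/calico/whisker:v3.30.0"))
# https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io
```

A 403 from this endpoint (rather than a 401 or a later blob fetch failure) suggests the registry is refusing anonymous access to the repository itself, which is consistent with every Calico image under ghcr.io/flatcar failing identically here.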