May 16 16:39:42.717655 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri May 16 14:52:24 -00 2025
May 16 16:39:42.717670 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137
May 16 16:39:42.717676 kernel: Disabled fast string operations
May 16 16:39:42.717680 kernel: BIOS-provided physical RAM map:
May 16 16:39:42.717684 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
May 16 16:39:42.717688 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
May 16 16:39:42.717694 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
May 16 16:39:42.717698 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
May 16 16:39:42.717702 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
May 16 16:39:42.717706 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
May 16 16:39:42.717710 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
May 16 16:39:42.717714 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
May 16 16:39:42.717718 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
May 16 16:39:42.717722 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
May 16 16:39:42.717727 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
May 16 16:39:42.717732 kernel: NX (Execute Disable) protection: active
May 16 16:39:42.717737 kernel: APIC: Static calls initialized
May 16 16:39:42.717741 kernel: SMBIOS 2.7 present.
May 16 16:39:42.717746 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
May 16 16:39:42.717751 kernel: DMI: Memory slots populated: 1/128
May 16 16:39:42.717756 kernel: vmware: hypercall mode: 0x00
May 16 16:39:42.717761 kernel: Hypervisor detected: VMware
May 16 16:39:42.717765 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
May 16 16:39:42.717770 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
May 16 16:39:42.717774 kernel: vmware: using clock offset of 3369874323 ns
May 16 16:39:42.717779 kernel: tsc: Detected 3408.000 MHz processor
May 16 16:39:42.717784 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 16 16:39:42.717789 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 16 16:39:42.717793 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
May 16 16:39:42.717798 kernel: total RAM covered: 3072M
May 16 16:39:42.717803 kernel: Found optimal setting for mtrr clean up
May 16 16:39:42.717808 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
May 16 16:39:42.717813 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
May 16 16:39:42.717818 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 16 16:39:42.717822 kernel: Using GB pages for direct mapping
May 16 16:39:42.717827 kernel: ACPI: Early table checksum verification disabled
May 16 16:39:42.717832 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
May 16 16:39:42.717836 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
May 16 16:39:42.717841 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
May 16 16:39:42.717847 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
May 16 16:39:42.717853 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
May 16 16:39:42.717858 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
May 16 16:39:42.717863 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
May 16 16:39:42.717868 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
May 16 16:39:42.717873 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
May 16 16:39:42.717878 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
May 16 16:39:42.717883 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
May 16 16:39:42.717888 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
May 16 16:39:42.717893 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
May 16 16:39:42.717898 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
May 16 16:39:42.717903 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
May 16 16:39:42.717908 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
May 16 16:39:42.717913 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
May 16 16:39:42.717917 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
May 16 16:39:42.717923 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
May 16 16:39:42.717928 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
May 16 16:39:42.717933 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
May 16 16:39:42.717937 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
May 16 16:39:42.717942 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
May 16 16:39:42.717947 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
May 16 16:39:42.717952 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
May 16 16:39:42.717957 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00001000-0x7fffffff]
May 16 16:39:42.717962 kernel: NODE_DATA(0) allocated [mem 0x7fff8dc0-0x7fffffff]
May 16 16:39:42.717968 kernel: Zone ranges:
May 16 16:39:42.717973 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 16 16:39:42.717977 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
May 16 16:39:42.717982 kernel: Normal empty
May 16 16:39:42.717987 kernel: Device empty
May 16 16:39:42.717992 kernel: Movable zone start for each node
May 16 16:39:42.717997 kernel: Early memory node ranges
May 16 16:39:42.718002 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
May 16 16:39:42.718006 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
May 16 16:39:42.718011 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
May 16 16:39:42.718017 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
May 16 16:39:42.718022 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 16 16:39:42.718027 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
May 16 16:39:42.718032 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
May 16 16:39:42.718036 kernel: ACPI: PM-Timer IO Port: 0x1008
May 16 16:39:42.718041 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
May 16 16:39:42.718046 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
May 16 16:39:42.718089 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
May 16 16:39:42.718094 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
May 16 16:39:42.718101 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
May 16 16:39:42.718106 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
May 16 16:39:42.718111 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
May 16 16:39:42.718116 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
May 16 16:39:42.718120 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
May 16 16:39:42.718125 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
May 16 16:39:42.718130 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
May 16 16:39:42.718134 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
May 16 16:39:42.718139 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
May 16 16:39:42.718144 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
May 16 16:39:42.718150 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
May 16 16:39:42.718154 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
May 16 16:39:42.718159 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
May 16 16:39:42.718164 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
May 16 16:39:42.718169 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
May 16 16:39:42.718173 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
May 16 16:39:42.718178 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
May 16 16:39:42.718183 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
May 16 16:39:42.718187 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
May 16 16:39:42.718193 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
May 16 16:39:42.718198 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
May 16 16:39:42.718203 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
May 16 16:39:42.718207 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
May 16 16:39:42.718212 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
May 16 16:39:42.718217 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
May 16 16:39:42.718221 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
May 16 16:39:42.718226 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
May 16 16:39:42.718231 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
May 16 16:39:42.718236 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
May 16 16:39:42.718241 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
May 16 16:39:42.718246 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
May 16 16:39:42.718251 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
May 16 16:39:42.718256 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
May 16 16:39:42.718260 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
May 16 16:39:42.718265 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
May 16 16:39:42.718270 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
May 16 16:39:42.718278 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
May 16 16:39:42.718284 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
May 16 16:39:42.718288 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
May 16 16:39:42.718294 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
May 16 16:39:42.718300 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
May 16 16:39:42.718305 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
May 16 16:39:42.718310 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
May 16 16:39:42.718315 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
May 16 16:39:42.718320 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
May 16 16:39:42.718325 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
May 16 16:39:42.718330 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
May 16 16:39:42.718336 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
May 16 16:39:42.718341 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
May 16 16:39:42.718346 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
May 16 16:39:42.718351 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
May 16 16:39:42.718356 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
May 16 16:39:42.718361 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
May 16 16:39:42.718366 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
May 16 16:39:42.718371 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
May 16 16:39:42.718376 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
May 16 16:39:42.718381 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
May 16 16:39:42.718387 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
May 16 16:39:42.718392 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
May 16 16:39:42.718397 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
May 16 16:39:42.718402 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
May 16 16:39:42.718407 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
May 16 16:39:42.718412 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
May 16 16:39:42.718417 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
May 16 16:39:42.718422 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
May 16 16:39:42.718427 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
May 16 16:39:42.718432 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
May 16 16:39:42.718438 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
May 16 16:39:42.718443 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
May 16 16:39:42.718448 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
May 16 16:39:42.718453 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
May 16 16:39:42.718458 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
May 16 16:39:42.718463 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
May 16 16:39:42.718468 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
May 16 16:39:42.718473 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
May 16 16:39:42.718478 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
May 16 16:39:42.718483 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
May 16 16:39:42.718489 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
May 16 16:39:42.718494 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
May 16 16:39:42.718499 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
May 16 16:39:42.718504 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
May 16 16:39:42.718509 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
May 16 16:39:42.718514 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
May 16 16:39:42.718519 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
May 16 16:39:42.718524 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
May 16 16:39:42.718529 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
May 16 16:39:42.718535 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
May 16 16:39:42.718540 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
May 16 16:39:42.718545 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
May 16 16:39:42.718550 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
May 16 16:39:42.718555 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
May 16 16:39:42.718561 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
May 16 16:39:42.718566 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
May 16 16:39:42.718571 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
May 16 16:39:42.718576 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
May 16 16:39:42.718581 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
May 16 16:39:42.718587 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
May 16 16:39:42.718592 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
May 16 16:39:42.718597 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
May 16 16:39:42.718602 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
May 16 16:39:42.718607 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
May 16 16:39:42.718612 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
May 16 16:39:42.718618 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
May 16 16:39:42.718622 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
May 16 16:39:42.718628 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
May 16 16:39:42.718633 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
May 16 16:39:42.718638 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
May 16 16:39:42.718644 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
May 16 16:39:42.718649 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
May 16 16:39:42.718654 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
May 16 16:39:42.718659 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
May 16 16:39:42.718664 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
May 16 16:39:42.718669 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
May 16 16:39:42.718674 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
May 16 16:39:42.718679 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
May 16 16:39:42.718684 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
May 16 16:39:42.718690 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
May 16 16:39:42.718695 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
May 16 16:39:42.718700 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
May 16 16:39:42.718705 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
May 16 16:39:42.718710 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
May 16 16:39:42.718715 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
May 16 16:39:42.718720 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
May 16 16:39:42.718725 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
May 16 16:39:42.718730 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
May 16 16:39:42.718735 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
May 16 16:39:42.718741 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 16 16:39:42.718747 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
May 16 16:39:42.718752 kernel: TSC deadline timer available
May 16 16:39:42.718757 kernel: CPU topo: Max. logical packages: 128
May 16 16:39:42.718762 kernel: CPU topo: Max. logical dies: 128
May 16 16:39:42.718767 kernel: CPU topo: Max. dies per package: 1
May 16 16:39:42.718772 kernel: CPU topo: Max. threads per core: 1
May 16 16:39:42.718777 kernel: CPU topo: Num. cores per package: 1
May 16 16:39:42.718782 kernel: CPU topo: Num. threads per package: 1
May 16 16:39:42.718788 kernel: CPU topo: Allowing 2 present CPUs plus 126 hotplug CPUs
May 16 16:39:42.718793 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
May 16 16:39:42.718799 kernel: Booting paravirtualized kernel on VMware hypervisor
May 16 16:39:42.718804 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 16 16:39:42.718809 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
May 16 16:39:42.718814 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
May 16 16:39:42.718820 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
May 16 16:39:42.718825 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
May 16 16:39:42.718830 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
May 16 16:39:42.718836 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
May 16 16:39:42.718841 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
May 16 16:39:42.718846 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
May 16 16:39:42.718851 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
May 16 16:39:42.718856 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
May 16 16:39:42.718861 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
May 16 16:39:42.718866 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
May 16 16:39:42.718871 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
May 16 16:39:42.718876 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
May 16 16:39:42.718882 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
May 16 16:39:42.718887 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
May 16 16:39:42.718892 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
May 16 16:39:42.718897 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
May 16 16:39:42.718902 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
May 16 16:39:42.718908 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137
May 16 16:39:42.718913 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 16 16:39:42.718919 kernel: random: crng init done
May 16 16:39:42.718924 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
May 16 16:39:42.718930 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
May 16 16:39:42.718935 kernel: printk: log_buf_len min size: 262144 bytes
May 16 16:39:42.718940 kernel: printk: log_buf_len: 1048576 bytes
May 16 16:39:42.718945 kernel: printk: early log buf free: 245576(93%)
May 16 16:39:42.718950 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 16 16:39:42.718955 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 16 16:39:42.718960 kernel: Fallback order for Node 0: 0
May 16 16:39:42.718966 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524157
May 16 16:39:42.718972 kernel: Policy zone: DMA32
May 16 16:39:42.718977 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 16 16:39:42.718982 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
May 16 16:39:42.718988 kernel: ftrace: allocating 40065 entries in 157 pages
May 16 16:39:42.718993 kernel: ftrace: allocated 157 pages with 5 groups
May 16 16:39:42.718998 kernel: Dynamic Preempt: voluntary
May 16 16:39:42.719003 kernel: rcu: Preemptible hierarchical RCU implementation.
May 16 16:39:42.719008 kernel: rcu: RCU event tracing is enabled.
May 16 16:39:42.719013 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
May 16 16:39:42.719019 kernel: Trampoline variant of Tasks RCU enabled.
May 16 16:39:42.719025 kernel: Rude variant of Tasks RCU enabled.
May 16 16:39:42.719030 kernel: Tracing variant of Tasks RCU enabled.
May 16 16:39:42.719035 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 16 16:39:42.719040 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
May 16 16:39:42.719045 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
May 16 16:39:42.719066 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
May 16 16:39:42.719072 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
May 16 16:39:42.719078 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
May 16 16:39:42.719085 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
May 16 16:39:42.719109 kernel: Console: colour VGA+ 80x25
May 16 16:39:42.719132 kernel: printk: legacy console [tty0] enabled
May 16 16:39:42.719137 kernel: printk: legacy console [ttyS0] enabled
May 16 16:39:42.719156 kernel: ACPI: Core revision 20240827
May 16 16:39:42.719161 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
May 16 16:39:42.719166 kernel: APIC: Switch to symmetric I/O mode setup
May 16 16:39:42.719172 kernel: x2apic enabled
May 16 16:39:42.719177 kernel: APIC: Switched APIC routing to: physical x2apic
May 16 16:39:42.719182 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 16 16:39:42.719189 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
May 16 16:39:42.719194 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
May 16 16:39:42.719199 kernel: Disabled fast string operations
May 16 16:39:42.719204 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
May 16 16:39:42.719209 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
May 16 16:39:42.719214 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 16 16:39:42.719220 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
May 16 16:39:42.719225 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
May 16 16:39:42.719231 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
May 16 16:39:42.719236 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
May 16 16:39:42.719241 kernel: RETBleed: Mitigation: Enhanced IBRS
May 16 16:39:42.719247 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 16 16:39:42.719252 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 16 16:39:42.719257 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
May 16 16:39:42.719262 kernel: SRBDS: Unknown: Dependent on hypervisor status
May 16 16:39:42.719267 kernel: GDS: Unknown: Dependent on hypervisor status
May 16 16:39:42.719273 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 16 16:39:42.719279 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 16 16:39:42.719284 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 16 16:39:42.719290 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 16 16:39:42.719295 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 16 16:39:42.719300 kernel: Freeing SMP alternatives memory: 32K
May 16 16:39:42.719305 kernel: pid_max: default: 131072 minimum: 1024
May 16 16:39:42.719310 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 16 16:39:42.719315 kernel: landlock: Up and running.
May 16 16:39:42.719321 kernel: SELinux: Initializing.
May 16 16:39:42.719327 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 16 16:39:42.719332 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 16 16:39:42.719338 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
May 16 16:39:42.719343 kernel: Performance Events: Skylake events, core PMU driver.
May 16 16:39:42.719348 kernel: core: CPUID marked event: 'cpu cycles' unavailable
May 16 16:39:42.719353 kernel: core: CPUID marked event: 'instructions' unavailable
May 16 16:39:42.719358 kernel: core: CPUID marked event: 'bus cycles' unavailable
May 16 16:39:42.719363 kernel: core: CPUID marked event: 'cache references' unavailable
May 16 16:39:42.719368 kernel: core: CPUID marked event: 'cache misses' unavailable
May 16 16:39:42.719374 kernel: core: CPUID marked event: 'branch instructions' unavailable
May 16 16:39:42.719379 kernel: core: CPUID marked event: 'branch misses' unavailable
May 16 16:39:42.719384 kernel: ... version: 1
May 16 16:39:42.719390 kernel: ... bit width: 48
May 16 16:39:42.719395 kernel: ... generic registers: 4
May 16 16:39:42.719400 kernel: ... value mask: 0000ffffffffffff
May 16 16:39:42.719405 kernel: ... max period: 000000007fffffff
May 16 16:39:42.719410 kernel: ... fixed-purpose events: 0
May 16 16:39:42.719415 kernel: ... event mask: 000000000000000f
May 16 16:39:42.719421 kernel: signal: max sigframe size: 1776
May 16 16:39:42.719426 kernel: rcu: Hierarchical SRCU implementation.
May 16 16:39:42.719432 kernel: rcu: Max phase no-delay instances is 400.
May 16 16:39:42.719437 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
May 16 16:39:42.719442 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
May 16 16:39:42.719447 kernel: smp: Bringing up secondary CPUs ...
May 16 16:39:42.719452 kernel: smpboot: x86: Booting SMP configuration:
May 16 16:39:42.719458 kernel: .... node #0, CPUs: #1
May 16 16:39:42.719463 kernel: Disabled fast string operations
May 16 16:39:42.719469 kernel: smp: Brought up 1 node, 2 CPUs
May 16 16:39:42.719474 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
May 16 16:39:42.719479 kernel: Memory: 1924280K/2096628K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 160964K reserved, 0K cma-reserved)
May 16 16:39:42.719485 kernel: devtmpfs: initialized
May 16 16:39:42.719490 kernel: x86/mm: Memory block size: 128MB
May 16 16:39:42.719495 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
May 16 16:39:42.719500 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 16 16:39:42.719505 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
May 16 16:39:42.719511 kernel: pinctrl core: initialized pinctrl subsystem
May 16 16:39:42.719517 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 16 16:39:42.719522 kernel: audit: initializing netlink subsys (disabled)
May 16 16:39:42.719527 kernel: audit: type=2000 audit(1747413580.064:1): state=initialized audit_enabled=0 res=1
May 16 16:39:42.719532 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 16 16:39:42.719537 kernel: thermal_sys: Registered thermal governor 'user_space'
May 16 16:39:42.719543 kernel: cpuidle: using governor menu
May 16 16:39:42.719548 kernel: Simple Boot Flag at 0x36 set to 0x80
May 16 16:39:42.719553 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 16 16:39:42.719558 kernel: dca service started, version 1.12.1
May 16 16:39:42.719564 kernel: PCI: ECAM [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) for domain 0000 [bus 00-7f]
May 16 16:39:42.719576 kernel: PCI: Using configuration type 1 for base access
May 16 16:39:42.719582 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 16 16:39:42.719587 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 16 16:39:42.719593 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 16 16:39:42.719598 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 16 16:39:42.719604 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 16 16:39:42.719609 kernel: ACPI: Added _OSI(Module Device)
May 16 16:39:42.719615 kernel: ACPI: Added _OSI(Processor Device)
May 16 16:39:42.719621 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 16 16:39:42.719627 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 16 16:39:42.719632 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 16 16:39:42.719638 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
May 16 16:39:42.719643 kernel: ACPI: Interpreter enabled
May 16 16:39:42.719648 kernel: ACPI: PM: (supports S0 S1 S5)
May 16 16:39:42.719654 kernel: ACPI: Using IOAPIC for interrupt routing
May 16 16:39:42.719659 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 16 16:39:42.719665 kernel: PCI: Using E820 reservations for host bridge windows
May 16 16:39:42.719671 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
May 16 16:39:42.719677 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
May 16 16:39:42.719746 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 16 16:39:42.719795 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
May 16 16:39:42.719840 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
May 16 16:39:42.719848 kernel: PCI host bridge to bus 0000:00
May 16 16:39:42.719894 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 16 16:39:42.719938 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
May 16 16:39:42.719978 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 16 16:39:42.720018 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 16 16:39:42.720092 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
May 16 16:39:42.720144 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
May 16 16:39:42.720217 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 conventional PCI endpoint
May 16 16:39:42.720275 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 conventional PCI bridge
May 16 16:39:42.720323 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
May 16 16:39:42.720376 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
May 16 16:39:42.720426 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a conventional PCI endpoint
May 16 16:39:42.720475 kernel: pci 0000:00:07.1: BAR 4 [io 0x1060-0x106f]
May 16 16:39:42.720521 kernel: pci 0000:00:07.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
May 16 16:39:42.720566 kernel: pci 0000:00:07.1: BAR 1 [io 0x03f6]: legacy IDE quirk
May 16 16:39:42.720611 kernel: pci 0000:00:07.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
May 16 16:39:42.720657 kernel: pci 0000:00:07.1: BAR 3 [io 0x0376]: legacy IDE quirk
May 16 16:39:42.720708 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
May 16 16:39:42.720756 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
May 16 16:39:42.720801 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
May 16 16:39:42.720850 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 conventional PCI endpoint
May 16 16:39:42.720896 kernel: pci 0000:00:07.7: BAR 0 [io 0x1080-0x10bf]
May 16 16:39:42.720942 kernel: pci 0000:00:07.7: BAR 1 [mem 0xfebfe000-0xfebfffff 64bit]
May 16 16:39:42.720991 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 conventional PCI endpoint
May 16 16:39:42.721037 kernel: pci 0000:00:0f.0: BAR 0 [io 0x1070-0x107f]
May 16 16:39:42.721108 kernel: pci 0000:00:0f.0: BAR 1 [mem 0xe8000000-0xefffffff pref]
May 16 16:39:42.721155 kernel: pci 0000:00:0f.0: BAR 2 [mem 0xfe000000-0xfe7fffff]
May 16 16:39:42.721200 kernel: pci 0000:00:0f.0: ROM [mem 0x00000000-0x00007fff pref]
May 16 16:39:42.721258 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 16 16:39:42.721307 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 conventional PCI bridge
May 16 16:39:42.721353 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
May 16 16:39:42.721397 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
May 16 16:39:42.721444 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
May 16 16:39:42.721488 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
May 16 16:39:42.721539 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.721585 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
May 16 16:39:42.721630 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
May 16 16:39:42.721674 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
May 16 16:39:42.722534 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
May 16 16:39:42.722626 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.722676 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
May 16 16:39:42.722722 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
May 16 16:39:42.722769 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
May 16 16:39:42.722815 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
May 16 16:39:42.722860 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
May 16 16:39:42.722911 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.722959 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
May 16 16:39:42.723004 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
May 16 16:39:42.723061 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
May 16 16:39:42.723169 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
May 16 16:39:42.723214 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
May 16 16:39:42.723264 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.723312 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
May 16 16:39:42.723358 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
May 16 16:39:42.723403 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
May 16 16:39:42.723447 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
May 16 16:39:42.723495 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.723542 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
May 16 16:39:42.723587 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
May 16 16:39:42.723634 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
May 16 16:39:42.723681 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
May 16 16:39:42.723731 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.723777 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
May 16 16:39:42.723822 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
May 16 16:39:42.723866 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
May 16 16:39:42.723911 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
May 16 16:39:42.723961 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.724007 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
May 16 16:39:42.724061 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
May 16 16:39:42.724108 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
May 16 16:39:42.724153 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
May 16 16:39:42.724202 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.724248 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
May 16 16:39:42.724296 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
May 16 16:39:42.724340 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
May 16 16:39:42.724387 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
May 16 16:39:42.724435 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.724482 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
May 16 16:39:42.724527 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
May 16 16:39:42.724571 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
May 16 16:39:42.724616 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
May 16 16:39:42.724668 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.724714 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
May 16 16:39:42.724759 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
May 16 16:39:42.724803 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
May 16 16:39:42.724864 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
May 16 16:39:42.724909 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
May 16 16:39:42.724959 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.725007 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
May 16 16:39:42.725070 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
May 16 16:39:42.725160 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
May 16 16:39:42.725206 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
May 16 16:39:42.725252 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
May 16 16:39:42.725301 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.725347 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
May 16 16:39:42.725395 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
May 16 16:39:42.725441 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
May 16 16:39:42.725486 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
May 16 16:39:42.725535 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.725582 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
May 16 16:39:42.725627 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
May 16 16:39:42.725672 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
May 16 16:39:42.725719 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
May 16 16:39:42.725771 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.725818 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
May 16 16:39:42.725864 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
May 16 16:39:42.725909 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
May 16 16:39:42.725955 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
May 16 16:39:42.726018 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.726108 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
May 16 16:39:42.726156 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
May 16 16:39:42.726201 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
May 16 16:39:42.726246 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
May 16 16:39:42.726294 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.726340 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
May 16 16:39:42.726385 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
May 16 16:39:42.726432 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
May 16 16:39:42.726476 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
May 16 16:39:42.726525 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.726570 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
May 16 16:39:42.726615 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
May 16 16:39:42.726660 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
May 16 16:39:42.726704 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
May 16 16:39:42.726749 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
May 16 16:39:42.726800 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.726846 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
May 16 16:39:42.726890 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
May 16 16:39:42.726937 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
May 16 16:39:42.726981 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
May 16 16:39:42.727025 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
May 16 16:39:42.727087 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.727165 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
May 16 16:39:42.727210 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
May 16 16:39:42.727254 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
May 16 16:39:42.727302 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
May 16 16:39:42.727346 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
May 16 16:39:42.727395 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.727441 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
May 16 16:39:42.727486 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
May 16 16:39:42.727531 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
May 16 16:39:42.727576 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
May 16 16:39:42.727627 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.727675 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
May 16 16:39:42.727720 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
May 16 16:39:42.727766 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
May 16 16:39:42.727811 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
May 16 16:39:42.727860 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.727906 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
May 16 16:39:42.727953 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
May 16 16:39:42.727997 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
May 16 16:39:42.728042 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
May 16 16:39:42.728142 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.728220 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
May 16 16:39:42.728265 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
May 16 16:39:42.728309 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
May 16 16:39:42.728355 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
May 16 16:39:42.728407 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.728453 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
May 16 16:39:42.728498 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
May 16 16:39:42.728543 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
May 16 16:39:42.728589 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
May 16 16:39:42.728639 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.728685 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
May 16 16:39:42.728733 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
May 16 16:39:42.728778 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
May 16 16:39:42.728822 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
May 16 16:39:42.728867 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
May 16 16:39:42.728916 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.728962 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
May 16 16:39:42.729007 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
May 16 16:39:42.729066 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
May 16 16:39:42.729151 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
May 16 16:39:42.729197 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
May 16 16:39:42.729248 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.729294 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
May 16 16:39:42.729340 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
May 16 16:39:42.729385 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
May 16 16:39:42.729432 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
May 16 16:39:42.729481 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.729528 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
May 16 16:39:42.729573 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
May 16 16:39:42.729618 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
May 16 16:39:42.729677 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
May 16 16:39:42.729730 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.729780 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
May 16 16:39:42.729835 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
May 16 16:39:42.729881 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
May 16 16:39:42.729926 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
May 16 16:39:42.729976 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.730023 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
May 16 16:39:42.730087 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
May 16 16:39:42.730137 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
May 16 16:39:42.730182 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
May 16 16:39:42.730232 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.730278 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
May 16 16:39:42.730323 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
May 16 16:39:42.730367 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
May 16 16:39:42.730413 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
May 16 16:39:42.730465 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 PCIe Root Port
May 16 16:39:42.730510 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
May 16 16:39:42.730556 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
May 16 16:39:42.730600 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
May 16 16:39:42.730646 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
May 16 16:39:42.730693 kernel: pci_bus 0000:01: extended config space not accessible
May 16 16:39:42.730739 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
May 16 16:39:42.730786 kernel: pci_bus 0000:02: extended config space not accessible
May 16 16:39:42.730796 kernel: acpiphp: Slot [32] registered
May 16 16:39:42.730802 kernel: acpiphp: Slot [33] registered
May 16 16:39:42.730807 kernel: acpiphp: Slot [34] registered
May 16 16:39:42.730813 kernel: acpiphp: Slot [35] registered
May 16 16:39:42.730818 kernel: acpiphp: Slot [36] registered
May 16 16:39:42.730824 kernel: acpiphp: Slot [37] registered
May 16 16:39:42.730829 kernel: acpiphp: Slot [38] registered
May 16 16:39:42.730835 kernel: acpiphp: Slot [39] registered
May 16 16:39:42.730840 kernel: acpiphp: Slot [40] registered
May 16 16:39:42.730847 kernel: acpiphp: Slot [41] registered
May 16 16:39:42.730852 kernel: acpiphp: Slot [42] registered
May 16 16:39:42.730858 kernel: acpiphp: Slot [43] registered
May 16 16:39:42.730863 kernel: acpiphp: Slot [44] registered
May 16 16:39:42.730868 kernel: acpiphp: Slot [45] registered
May 16 16:39:42.730873 kernel: acpiphp: Slot [46] registered
May 16 16:39:42.730879 kernel: acpiphp: Slot [47] registered
May 16 16:39:42.730884 kernel: acpiphp: Slot [48] registered
May 16 16:39:42.730889 kernel: acpiphp: Slot [49] registered
May 16 16:39:42.730896 kernel: acpiphp: Slot [50] registered
May 16 16:39:42.730901 kernel: acpiphp: Slot [51] registered
May 16 16:39:42.730907 kernel: acpiphp: Slot [52] registered
May 16 16:39:42.730912 kernel: acpiphp: Slot [53] registered
May 16 16:39:42.730917 kernel: acpiphp: Slot [54] registered
May 16 16:39:42.730923 kernel: acpiphp: Slot [55] registered
May 16 16:39:42.730928 kernel: acpiphp: Slot [56] registered
May 16 16:39:42.730934 kernel: acpiphp: Slot [57] registered
May 16 16:39:42.730939 kernel: acpiphp: Slot [58] registered
May 16 16:39:42.730945 kernel: acpiphp: Slot [59] registered
May 16 16:39:42.730951 kernel: acpiphp: Slot [60] registered
May 16 16:39:42.730957 kernel: acpiphp: Slot [61] registered
May 16 16:39:42.730962 kernel: acpiphp: Slot [62] registered
May 16 16:39:42.730968 kernel: acpiphp: Slot [63] registered
May 16 16:39:42.731012 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
May 16 16:39:42.731072 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
May 16 16:39:42.731139 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
May 16 16:39:42.731198 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
May 16 16:39:42.731244 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
May 16 16:39:42.731289 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
May 16 16:39:42.731341 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 PCIe Endpoint
May 16 16:39:42.731388 kernel: pci 0000:03:00.0: BAR 0 [io 0x4000-0x4007]
May 16 16:39:42.731451 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfd5f8000-0xfd5fffff 64bit]
May 16 16:39:42.731497 kernel: pci 0000:03:00.0: ROM [mem 0x00000000-0x0000ffff pref]
May 16 16:39:42.731557 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
May 16 16:39:42.731604 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
May 16 16:39:42.731650 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
May 16 16:39:42.731697 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
May 16 16:39:42.731742 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
May 16 16:39:42.731788 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
May 16 16:39:42.731834 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
May 16 16:39:42.731889 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
May 16 16:39:42.731936 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
May 16 16:39:42.731983 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
May 16 16:39:42.732037 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 PCIe Endpoint
May 16 16:39:42.733153 kernel: pci 0000:0b:00.0: BAR 0 [mem 0xfd4fc000-0xfd4fcfff]
May 16 16:39:42.733206 kernel: pci 0000:0b:00.0: BAR 1 [mem 0xfd4fd000-0xfd4fdfff]
May 16 16:39:42.733255 kernel: pci 0000:0b:00.0: BAR 2 [mem 0xfd4fe000-0xfd4fffff]
May 16 16:39:42.733302 kernel: pci 0000:0b:00.0: BAR 3 [io 0x5000-0x500f]
May 16 16:39:42.733349 kernel: pci 0000:0b:00.0: ROM [mem 0x00000000-0x0000ffff pref]
May 16 16:39:42.733399 kernel: pci 0000:0b:00.0: supports D1 D2
May 16 16:39:42.733446 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
May 16 16:39:42.733493 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
May 16 16:39:42.733540 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
May 16 16:39:42.733587 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
May 16 16:39:42.733633 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
May 16 16:39:42.733680 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
May 16 16:39:42.733728 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
May 16 16:39:42.733775 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
May 16 16:39:42.733827 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
May 16 16:39:42.733875 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
May 16 16:39:42.733922 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
May 16 16:39:42.733979 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
May 16 16:39:42.734029 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
May 16 16:39:42.735123 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
May 16 16:39:42.735190 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
May 16 16:39:42.735237 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
May 16 16:39:42.735282 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
May 16 16:39:42.735328 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
May 16 16:39:42.735374 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
May 16 16:39:42.735419 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
May 16 16:39:42.735465 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
May 16 16:39:42.735513 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
May 16 16:39:42.735558 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
May 16 16:39:42.735603 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
May 16 16:39:42.735649 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
May 16 16:39:42.735695 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
May 16 16:39:42.735703 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
May 16 16:39:42.735709 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
May 16 16:39:42.735714 kernel: ACPI: PCI: Interrupt link LNKB disabled
May 16 16:39:42.735721 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 16 16:39:42.735727 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
May 16 16:39:42.735733 kernel: iommu: Default domain type: Translated
May 16 16:39:42.735738 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 16 16:39:42.735744 kernel: PCI: Using ACPI for IRQ routing
May 16 16:39:42.735749 kernel: PCI: pci_cache_line_size set to 64 bytes
May 16 16:39:42.735755 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
May 16 16:39:42.735761 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
May 16 16:39:42.735804 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
May 16 16:39:42.735851 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
May 16 16:39:42.735896 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 16 16:39:42.735904 kernel: vgaarb: loaded
May 16 16:39:42.735910 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
May 16 16:39:42.735916 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
May 16 16:39:42.735921 kernel: clocksource: Switched to clocksource tsc-early
May 16 16:39:42.735927 kernel: VFS: Disk quotas dquot_6.6.0
May 16 16:39:42.735932 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 16 16:39:42.735938 kernel: pnp: PnP ACPI init
May 16 16:39:42.735985 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
May 16 16:39:42.736028 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
May 16 16:39:42.737127 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
May 16 16:39:42.737194 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
May 16 16:39:42.737241 kernel: pnp 00:06: [dma 2]
May 16 16:39:42.737290 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
May 16 16:39:42.737335 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
May 16 16:39:42.737376 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
May 16 16:39:42.737384 kernel: pnp: PnP ACPI: found 8 devices
May 16 16:39:42.737391 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 16 16:39:42.737396 kernel: NET: Registered PF_INET protocol family
May 16 16:39:42.737402 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 16 16:39:42.737408 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
May 16 16:39:42.737414 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 16 16:39:42.737421 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 16 16:39:42.737427 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
May 16 16:39:42.737432 kernel: TCP: Hash tables configured (established 16384 bind 16384)
May 16 16:39:42.737438 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 16 16:39:42.737444 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 16 16:39:42.737449 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 16 16:39:42.737455 kernel: NET: Registered PF_XDP protocol family
May 16 16:39:42.737503 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
May 16 16:39:42.737552 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
May 16 16:39:42.737601 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
May 16 16:39:42.737647 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
May 16 16:39:42.737694 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
May 16 16:39:42.737739 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
May 16 16:39:42.737785 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
May 16 16:39:42.737831 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
May 16 16:39:42.737876 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
May 16 16:39:42.737923 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
May 16 16:39:42.737969 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
May 16 16:39:42.738014 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
May 16 16:39:42.738071 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
May 16 16:39:42.738121 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
May 16 16:39:42.738166 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
May 16 16:39:42.738211 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
May 16 16:39:42.738257 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
May 16 16:39:42.738305 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
May 16 16:39:42.738351 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
May 16 16:39:42.738396 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
May 16 16:39:42.738441 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
May 16 16:39:42.738487 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
May 16 16:39:42.738532 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
May 16 16:39:42.738577 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]: assigned
May 16 16:39:42.738622 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]: assigned
May 16 16:39:42.738670 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.738716 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.738760 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.738806 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.738851 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.738896 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.738941 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.738986 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739034 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739095 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739142 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739187 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739233 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739277 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739322 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739369 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739414 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739458 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739502 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739548 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739593 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739637 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739681 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739728 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign
May 16 16:39:42.739773 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space
May 16 16:39:42.739817 kernel: pci 0000:00:17.5:
bridge window [io size 0x1000]: failed to assign May 16 16:39:42.739862 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.739907 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.739952 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.739996 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740041 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740113 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740172 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740217 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740262 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740307 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740352 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740397 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740442 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740489 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740534 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740579 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740623 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740669 kernel: pci 0000:00:18.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740715 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740759 kernel: pci 0000:00:18.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740821 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740866 kernel: pci 0000:00:18.5: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.740914 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.740972 kernel: pci 0000:00:18.4: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.741015 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.741710 kernel: pci 0000:00:18.3: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.741759 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.741806 kernel: pci 0000:00:18.2: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.741852 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.741916 kernel: pci 0000:00:17.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.741961 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742023 kernel: pci 0000:00:17.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742087 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742135 kernel: pci 0000:00:17.5: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742181 kernel: pci 0000:00:17.4: bridge window [io size 
0x1000]: can't assign; no space May 16 16:39:42.742226 kernel: pci 0000:00:17.4: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742271 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742316 kernel: pci 0000:00:17.3: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742361 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742406 kernel: pci 0000:00:16.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742450 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742499 kernel: pci 0000:00:16.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742543 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742588 kernel: pci 0000:00:16.5: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742632 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742677 kernel: pci 0000:00:16.4: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742722 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742767 kernel: pci 0000:00:16.3: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742814 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742860 kernel: pci 0000:00:15.7: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742905 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.742950 kernel: pci 0000:00:15.6: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.742995 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.743039 kernel: pci 0000:00:15.5: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.743122 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.743197 kernel: pci 0000:00:15.4: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.743245 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: can't assign; no space May 16 16:39:42.743290 kernel: pci 0000:00:15.3: bridge window [io size 0x1000]: failed to assign May 16 16:39:42.743335 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 16 16:39:42.743380 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] May 16 16:39:42.743425 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] May 16 16:39:42.743471 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] May 16 16:39:42.743516 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] May 16 16:39:42.743563 kernel: pci 0000:03:00.0: ROM [mem 0xfd500000-0xfd50ffff pref]: assigned May 16 16:39:42.743611 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 16 16:39:42.743656 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] May 16 16:39:42.743701 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] May 16 16:39:42.743746 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] May 16 16:39:42.743792 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 16 16:39:42.743836 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] May 16 16:39:42.743881 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] May 16 16:39:42.743927 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit 
pref] May 16 16:39:42.743989 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 16 16:39:42.744035 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] May 16 16:39:42.744098 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] May 16 16:39:42.744146 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] May 16 16:39:42.744192 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 16 16:39:42.744239 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] May 16 16:39:42.744287 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] May 16 16:39:42.744349 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 16 16:39:42.744442 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] May 16 16:39:42.744489 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] May 16 16:39:42.744554 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 16 16:39:42.744601 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] May 16 16:39:42.744648 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] May 16 16:39:42.744695 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 16 16:39:42.744741 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] May 16 16:39:42.744788 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] May 16 16:39:42.744835 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 16 16:39:42.744900 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] May 16 16:39:42.744946 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] May 16 16:39:42.744994 kernel: pci 0000:0b:00.0: ROM [mem 0xfd400000-0xfd40ffff pref]: assigned May 16 16:39:42.745040 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 16 16:39:42.745155 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] May 16 16:39:42.745216 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] May 16 16:39:42.745262 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] May 16 16:39:42.745308 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 16 16:39:42.745353 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] May 16 16:39:42.745401 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] May 16 16:39:42.745446 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] May 16 16:39:42.745493 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 16 16:39:42.745539 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] May 16 16:39:42.745584 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] May 16 16:39:42.745630 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] May 16 16:39:42.745676 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 16 16:39:42.745720 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] May 16 16:39:42.745785 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] May 16 16:39:42.745849 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 16 16:39:42.745894 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] May 16 16:39:42.745939 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] May 16 16:39:42.745985 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 16 16:39:42.746030 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] May 16 16:39:42.746087 kernel: pci 
0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] May 16 16:39:42.746134 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 16 16:39:42.746181 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] May 16 16:39:42.746228 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] May 16 16:39:42.746292 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 16 16:39:42.746351 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] May 16 16:39:42.746397 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] May 16 16:39:42.746443 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 16 16:39:42.746488 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] May 16 16:39:42.746533 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] May 16 16:39:42.746580 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] May 16 16:39:42.746627 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 16 16:39:42.746672 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] May 16 16:39:42.746718 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] May 16 16:39:42.746764 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] May 16 16:39:42.746810 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 16 16:39:42.746856 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] May 16 16:39:42.746900 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] May 16 16:39:42.746946 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] May 16 16:39:42.746994 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 16 16:39:42.747039 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] May 16 16:39:42.747099 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] May 16 16:39:42.747148 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 16 16:39:42.747195 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] May 16 16:39:42.747242 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] May 16 16:39:42.747288 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 16 16:39:42.747333 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] May 16 16:39:42.747383 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] May 16 16:39:42.747429 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 16 16:39:42.747474 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] May 16 16:39:42.747519 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] May 16 16:39:42.747566 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 16 16:39:42.747611 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] May 16 16:39:42.747656 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] May 16 16:39:42.747705 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 16 16:39:42.747751 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] May 16 16:39:42.747796 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] May 16 16:39:42.747842 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] May 16 16:39:42.747888 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 16 16:39:42.747934 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] May 16 16:39:42.747980 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] 
May 16 16:39:42.748025 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] May 16 16:39:42.748081 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 16 16:39:42.748129 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] May 16 16:39:42.748175 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] May 16 16:39:42.748221 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 16 16:39:42.748266 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] May 16 16:39:42.748312 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] May 16 16:39:42.748358 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 16 16:39:42.748404 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] May 16 16:39:42.748449 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] May 16 16:39:42.748497 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 16 16:39:42.748543 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] May 16 16:39:42.748588 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] May 16 16:39:42.748633 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 16 16:39:42.748679 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] May 16 16:39:42.748723 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] May 16 16:39:42.748769 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 16 16:39:42.748816 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] May 16 16:39:42.748862 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] May 16 16:39:42.748905 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] May 16 16:39:42.748945 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] May 16 16:39:42.748985 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] May 16 16:39:42.749025 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] May 16 16:39:42.749074 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] May 16 16:39:42.749124 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] May 16 16:39:42.749167 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] May 16 16:39:42.749209 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] May 16 16:39:42.749250 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] May 16 16:39:42.749291 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] May 16 16:39:42.749336 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] May 16 16:39:42.749378 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] May 16 16:39:42.749422 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] May 16 16:39:42.749468 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] May 16 16:39:42.749511 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] May 16 16:39:42.749552 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] May 16 16:39:42.749597 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] May 16 16:39:42.749640 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] May 16 16:39:42.749681 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] May 16 16:39:42.749728 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] May 16 16:39:42.749770 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] May 
16 16:39:42.749812 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] May 16 16:39:42.749859 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] May 16 16:39:42.749901 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] May 16 16:39:42.749947 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] May 16 16:39:42.749989 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] May 16 16:39:42.750035 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] May 16 16:39:42.750087 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] May 16 16:39:42.750133 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] May 16 16:39:42.750174 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] May 16 16:39:42.750221 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] May 16 16:39:42.750264 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] May 16 16:39:42.750310 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] May 16 16:39:42.750352 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] May 16 16:39:42.750394 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] May 16 16:39:42.750438 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] May 16 16:39:42.750480 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] May 16 16:39:42.750524 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] May 16 16:39:42.750569 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] May 16 16:39:42.750611 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] May 16 16:39:42.750652 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] May 16 16:39:42.750698 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] May 16 16:39:42.750741 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] May 16 16:39:42.750787 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] May 16 16:39:42.750831 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] May 16 16:39:42.750877 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] May 16 16:39:42.750919 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] May 16 16:39:42.750965 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] May 16 16:39:42.751007 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] May 16 16:39:42.751061 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] May 16 16:39:42.751123 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] May 16 16:39:42.751169 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] May 16 16:39:42.751210 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] May 16 16:39:42.751251 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] May 16 16:39:42.751295 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] May 16 16:39:42.751336 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] May 16 16:39:42.751377 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] May 16 16:39:42.751423 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] May 16 16:39:42.751465 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] May 16 16:39:42.751505 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] May 16 16:39:42.751549 kernel: 
pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] May 16 16:39:42.751590 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] May 16 16:39:42.751636 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] May 16 16:39:42.751679 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] May 16 16:39:42.751741 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] May 16 16:39:42.751797 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] May 16 16:39:42.751841 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] May 16 16:39:42.751883 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] May 16 16:39:42.751928 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] May 16 16:39:42.751969 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] May 16 16:39:42.752015 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] May 16 16:39:42.752069 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] May 16 16:39:42.752132 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] May 16 16:39:42.752179 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] May 16 16:39:42.752222 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] May 16 16:39:42.752263 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] May 16 16:39:42.752310 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] May 16 16:39:42.752353 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] May 16 16:39:42.752399 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] May 16 16:39:42.752441 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] May 16 16:39:42.752485 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] May 16 16:39:42.752527 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] May 16 16:39:42.752573 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] May 16 16:39:42.752617 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] May 16 16:39:42.752662 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] May 16 16:39:42.752704 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] May 16 16:39:42.752750 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] May 16 16:39:42.752791 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] May 16 16:39:42.752840 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 16 16:39:42.752850 kernel: PCI: CLS 32 bytes, default 64 May 16 16:39:42.752856 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer May 16 16:39:42.752862 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns May 16 16:39:42.752868 kernel: clocksource: Switched to clocksource tsc May 16 16:39:42.752874 kernel: Initialise system trusted keyrings May 16 16:39:42.752879 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 16 16:39:42.752903 kernel: Key type asymmetric registered May 16 16:39:42.752908 kernel: Asymmetric key parser 'x509' registered May 16 16:39:42.752915 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 16 16:39:42.752921 kernel: io scheduler mq-deadline registered May 16 16:39:42.752927 kernel: io scheduler kyber registered May 16 16:39:42.752933 kernel: io scheduler bfq 
registered May 16 16:39:42.752980 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 May 16 16:39:42.753029 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753112 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 May 16 16:39:42.753175 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753225 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 May 16 16:39:42.753271 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753317 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 May 16 16:39:42.753363 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753409 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 May 16 16:39:42.753456 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753502 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 May 16 16:39:42.753550 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753596 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 May 16 16:39:42.753643 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753688 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 May 16 16:39:42.753734 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753780 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 May 16 16:39:42.753826 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753876 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 May 16 16:39:42.753921 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.753967 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 May 16 16:39:42.754013 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754073 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 May 16 16:39:42.754130 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754178 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 May 16 16:39:42.754224 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754273 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 May 16 16:39:42.754319 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ 
May 16 16:39:42.754364 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 May 16 16:39:42.754411 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754457 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 May 16 16:39:42.754503 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754549 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 May 16 16:39:42.754597 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754644 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 May 16 16:39:42.754689 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754735 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 May 16 16:39:42.754783 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754829 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 May 16 16:39:42.754875 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.754921 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 May 16 16:39:42.754969 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.755018 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 May 16 16:39:42.755074 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.755178 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 May 16 16:39:42.755224 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.755271 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 May 16 16:39:42.757002 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757067 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 May 16 16:39:42.757117 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757163 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 May 16 16:39:42.757210 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757257 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 May 16 16:39:42.757303 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757349 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 May 16 16:39:42.757394 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 
16:39:42.757442 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 May 16 16:39:42.757488 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757533 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 May 16 16:39:42.757578 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757623 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 May 16 16:39:42.757670 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757714 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 May 16 16:39:42.757762 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 16 16:39:42.757773 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 16 16:39:42.757779 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 16 16:39:42.757785 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 16 16:39:42.757791 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 May 16 16:39:42.757797 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 16 16:39:42.757802 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 16 16:39:42.757847 kernel: rtc_cmos 00:01: registered as rtc0 May 16 16:39:42.757897 kernel: rtc_cmos 00:01: setting system clock to 2025-05-16T16:39:42 UTC (1747413582) May 16 16:39:42.757940 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram May 16 16:39:42.757949 kernel: intel_pstate: CPU model not supported May 16 16:39:42.757955 kernel: NET: Registered PF_INET6 protocol family May 16 16:39:42.757961 kernel: Segment Routing with IPv6 May 16 16:39:42.757967 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 May 16 16:39:42.757973 kernel: In-situ OAM (IOAM) with IPv6 May 16 16:39:42.757979 kernel: NET: Registered PF_PACKET protocol family May 16 16:39:42.757986 kernel: Key type dns_resolver registered May 16 16:39:42.757992 kernel: IPI shorthand broadcast: enabled May 16 16:39:42.757998 kernel: sched_clock: Marking stable (2370003479, 173291357)->(2560254351, -16959515) May 16 16:39:42.758004 kernel: registered taskstats version 1 May 16 16:39:42.758010 kernel: Loading compiled-in X.509 certificates May 16 16:39:42.758016 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 310304ddc2cf6c43796c9bf79d11c0543afdf71f' May 16 16:39:42.758022 kernel: Demotion targets for Node 0: null May 16 16:39:42.758028 kernel: Key type .fscrypt registered May 16 16:39:42.758034 kernel: Key type fscrypt-provisioning registered May 16 16:39:42.758040 kernel: ima: No TPM chip found, activating TPM-bypass! May 16 16:39:42.758046 kernel: ima: Allocated hash algorithm: sha1 May 16 16:39:42.758060 kernel: ima: No architecture policies found May 16 16:39:42.758066 kernel: clk: Disabling unused clocks May 16 16:39:42.758073 kernel: Warning: unable to open an initial console. 
May 16 16:39:42.758079 kernel: Freeing unused kernel image (initmem) memory: 54416K May 16 16:39:42.758085 kernel: Write protecting the kernel read-only data: 24576k May 16 16:39:42.758090 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 16 16:39:42.758098 kernel: Run /init as init process May 16 16:39:42.758104 kernel: with arguments: May 16 16:39:42.758111 kernel: /init May 16 16:39:42.758117 kernel: with environment: May 16 16:39:42.758122 kernel: HOME=/ May 16 16:39:42.758128 kernel: TERM=linux May 16 16:39:42.758133 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 16 16:39:42.758140 systemd[1]: Successfully made /usr/ read-only. May 16 16:39:42.758149 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 16 16:39:42.758175 systemd[1]: Detected virtualization vmware. May 16 16:39:42.758181 systemd[1]: Detected architecture x86-64. May 16 16:39:42.758187 systemd[1]: Running in initrd. May 16 16:39:42.758193 systemd[1]: No hostname configured, using default hostname. May 16 16:39:42.758200 systemd[1]: Hostname set to <localhost>. May 16 16:39:42.758206 systemd[1]: Initializing machine ID from random generator. May 16 16:39:42.758212 systemd[1]: Queued start job for default target initrd.target. May 16 16:39:42.758218 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 16 16:39:42.758225 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 16 16:39:42.758232 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 16 16:39:42.758238 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 16 16:39:42.758244 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 16 16:39:42.758250 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 16 16:39:42.758257 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 16 16:39:42.758264 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 16 16:39:42.758271 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 16 16:39:42.758277 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 16 16:39:42.758283 systemd[1]: Reached target paths.target - Path Units. May 16 16:39:42.758289 systemd[1]: Reached target slices.target - Slice Units. May 16 16:39:42.758295 systemd[1]: Reached target swap.target - Swaps. May 16 16:39:42.758301 systemd[1]: Reached target timers.target - Timer Units. May 16 16:39:42.758308 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 16 16:39:42.758314 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 16 16:39:42.758322 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 16 16:39:42.758328 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 16 16:39:42.758334 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 16 16:39:42.758340 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 16 16:39:42.758346 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 16 16:39:42.758352 systemd[1]: Reached target sockets.target - Socket Units. May 16 16:39:42.758358 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 16 16:39:42.758364 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 16 16:39:42.758370 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 16 16:39:42.758378 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 16 16:39:42.758384 systemd[1]: Starting systemd-fsck-usr.service... May 16 16:39:42.758390 systemd[1]: Starting systemd-journald.service - Journal Service... May 16 16:39:42.758396 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 16 16:39:42.758402 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:39:42.758408 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 16 16:39:42.758415 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 16 16:39:42.758421 systemd[1]: Finished systemd-fsck-usr.service. May 16 16:39:42.758428 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 16 16:39:42.758446 systemd-journald[245]: Collecting audit messages is disabled. May 16 16:39:42.758464 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 16 16:39:42.758470 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 16 16:39:42.758477 kernel: Bridge firewalling registered May 16 16:39:42.758483 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 16 16:39:42.758489 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:39:42.758495 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 16 16:39:42.758502 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 16 16:39:42.758508 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 16 16:39:42.758515 systemd-journald[245]: Journal started May 16 16:39:42.758529 systemd-journald[245]: Runtime Journal (/run/log/journal/81326d4fcf8a4422980dad5491cad09f) is 4.8M, max 38.8M, 34M free. May 16 16:39:42.709196 systemd-modules-load[247]: Inserted module 'overlay' May 16 16:39:42.734075 systemd-modules-load[247]: Inserted module 'br_netfilter' May 16 16:39:42.760241 systemd[1]: Started systemd-journald.service - Journal Service. May 16 16:39:42.763461 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 16 16:39:42.765895 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 16 16:39:42.766490 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 16 16:39:42.767471 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 16:39:42.768251 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
May 16 16:39:42.774409 systemd-tmpfiles[275]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 16 16:39:42.776421 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 16 16:39:42.778128 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 16 16:39:42.779942 dracut-cmdline[281]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137 May 16 16:39:42.802222 systemd-resolved[293]: Positive Trust Anchors: May 16 16:39:42.802230 systemd-resolved[293]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 16 16:39:42.802251 systemd-resolved[293]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 16 16:39:42.804103 systemd-resolved[293]: Defaulting to hostname 'linux'. May 16 16:39:42.804704 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 16 16:39:42.804919 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 16 16:39:42.828071 kernel: SCSI subsystem initialized May 16 16:39:42.834060 kernel: Loading iSCSI transport class v2.0-870. May 16 16:39:42.841062 kernel: iscsi: registered transport (tcp) May 16 16:39:42.854060 kernel: iscsi: registered transport (qla4xxx) May 16 16:39:42.854076 kernel: QLogic iSCSI HBA Driver May 16 16:39:42.863568 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 16 16:39:42.876168 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 16 16:39:42.877318 systemd[1]: Reached target network-pre.target - Preparation for Network. May 16 16:39:42.898479 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 16 16:39:42.899207 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 16 16:39:42.939112 kernel: raid6: avx2x4 gen() 49751 MB/s May 16 16:39:42.955094 kernel: raid6: avx2x2 gen() 54222 MB/s May 16 16:39:42.972252 kernel: raid6: avx2x1 gen() 46686 MB/s May 16 16:39:42.972274 kernel: raid6: using algorithm avx2x2 gen() 54222 MB/s May 16 16:39:42.990261 kernel: raid6: .... xor() 33524 MB/s, rmw enabled May 16 16:39:42.990283 kernel: raid6: using avx2x2 recovery algorithm May 16 16:39:43.003060 kernel: xor: automatically using best checksumming function avx May 16 16:39:43.099066 kernel: Btrfs loaded, zoned=no, fsverity=no May 16 16:39:43.101777 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 16 16:39:43.102780 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
May 16 16:39:43.117448 systemd-udevd[494]: Using default interface naming scheme 'v255'. May 16 16:39:43.120613 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 16:39:43.121480 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 16 16:39:43.136088 dracut-pre-trigger[500]: rd.md=0: removing MD RAID activation May 16 16:39:43.148326 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 16 16:39:43.149124 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 16 16:39:43.217459 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 16 16:39:43.219004 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 16 16:39:43.278062 kernel: VMware PVSCSI driver - version 1.0.7.0-k May 16 16:39:43.281062 kernel: vmw_pvscsi: using 64bit dma May 16 16:39:43.285148 kernel: vmw_pvscsi: max_id: 16 May 16 16:39:43.285164 kernel: vmw_pvscsi: setting ring_pages to 8 May 16 16:39:43.286061 kernel: VMware vmxnet3 virtual NIC driver - version 1.9.0.0-k-NAPI May 16 16:39:43.296164 kernel: vmw_pvscsi: enabling reqCallThreshold May 16 16:39:43.296183 kernel: vmw_pvscsi: driver-based request coalescing enabled May 16 16:39:43.296192 kernel: vmw_pvscsi: using MSI-X May 16 16:39:43.296199 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 May 16 16:39:43.296220 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 May 16 16:39:43.309455 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps May 16 16:39:43.309532 kernel: cryptd: max_cpu_qlen set to 1000 May 16 16:39:43.309541 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 May 16 16:39:43.313270 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 May 16 16:39:43.313351 kernel: libata version 3.00 loaded. May 16 16:39:43.315903 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 16 16:39:43.317350 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:39:43.318079 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 May 16 16:39:43.317622 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:39:43.318404 (udev-worker)[543]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. May 16 16:39:43.319196 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 16 16:39:43.326094 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 May 16 16:39:43.326194 kernel: ata_piix 0000:00:07.1: version 2.13 May 16 16:39:43.330629 kernel: scsi host1: ata_piix May 16 16:39:43.330701 kernel: scsi host2: ata_piix May 16 16:39:43.330759 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 lpm-pol 0 May 16 16:39:43.330768 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 lpm-pol 0 May 16 16:39:43.331066 kernel: AES CTR mode by8 optimization enabled May 16 16:39:43.333058 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) May 16 16:39:43.356550 kernel: sd 0:0:0:0: [sda] Write Protect is off May 16 16:39:43.356620 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 May 16 16:39:43.356678 kernel: sd 0:0:0:0: [sda] Cache data unavailable May 16 16:39:43.356734 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through May 16 16:39:43.356790 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 16 16:39:43.356798 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 16 16:39:43.348587 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:39:43.495127 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 May 16 16:39:43.499060 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 May 16 16:39:43.525175 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray May 16 16:39:43.535502 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 16 16:39:43.535517 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 16 16:39:43.551034 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. May 16 16:39:43.556279 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. May 16 16:39:43.561464 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. May 16 16:39:43.565631 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. May 16 16:39:43.565745 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. May 16 16:39:43.566371 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 16 16:39:43.604071 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 16 16:39:43.614074 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 16 16:39:43.795671 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 16 16:39:43.796109 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 16 16:39:43.796295 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 16 16:39:43.796562 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 16 16:39:43.797389 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 16 16:39:43.811911 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 16 16:39:44.613969 disk-uuid[648]: The operation has completed successfully. May 16 16:39:44.614289 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 16 16:39:44.649275 systemd[1]: disk-uuid.service: Deactivated successfully. May 16 16:39:44.649335 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 16 16:39:44.664751 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
May 16 16:39:44.676741 sh[678]: Success
May 16 16:39:44.688404 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 16 16:39:44.688441 kernel: device-mapper: uevent: version 1.0.3
May 16 16:39:44.689729 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 16 16:39:44.697152 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
May 16 16:39:44.739850 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 16 16:39:44.741071 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 16 16:39:44.753405 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 16 16:39:44.766062 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 16 16:39:44.766097 kernel: BTRFS: device fsid 85b2a34c-237f-4a0a-87d0-0a783de0f256 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (690)
May 16 16:39:44.767558 kernel: BTRFS info (device dm-0): first mount of filesystem 85b2a34c-237f-4a0a-87d0-0a783de0f256
May 16 16:39:44.767578 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 16 16:39:44.769210 kernel: BTRFS info (device dm-0): using free-space-tree
May 16 16:39:44.777022 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 16 16:39:44.777371 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 16 16:39:44.777972 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
May 16 16:39:44.779112 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 16 16:39:44.801129 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (713)
May 16 16:39:44.801165 kernel: BTRFS info (device sda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d
May 16 16:39:44.803247 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 16 16:39:44.803265 kernel: BTRFS info (device sda6): using free-space-tree
May 16 16:39:44.812228 kernel: BTRFS info (device sda6): last unmount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d
May 16 16:39:44.812448 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 16 16:39:44.816582 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 16 16:39:44.861745 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
May 16 16:39:44.862551 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 16 16:39:44.939425 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 16 16:39:44.940383 systemd[1]: Starting systemd-networkd.service - Network Configuration...
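The verity-setup.service entries above show dm-verity opening the read-only /usr partition and validating it against the root hash passed on the kernel command line (on Flatcar, verity.usrhash=). A minimal sketch of the equivalent manual steps, with placeholder device paths; the real data device and hash-tree location are system-specific and are not shown in this log:

  # Extract the expected root hash from the kernel command line, then open the
  # verity mapping and mount it read-only. /dev/sdXN and /dev/sdXM are
  # illustrative placeholders, not the devices used on this machine.
  ROOTHASH=$(sed -n 's/.*verity.usrhash=\([0-9a-f]*\).*/\1/p' /proc/cmdline)
  veritysetup open /dev/sdXN usr /dev/sdXM "$ROOTHASH"   # data dev, dm name, hash dev, root hash
  mount -o ro /dev/mapper/usr /sysusr/usr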
May 16 16:39:44.946593 ignition[732]: Ignition 2.21.0
May 16 16:39:44.946770 ignition[732]: Stage: fetch-offline
May 16 16:39:44.946788 ignition[732]: no configs at "/usr/lib/ignition/base.d"
May 16 16:39:44.946792 ignition[732]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:44.946836 ignition[732]: parsed url from cmdline: ""
May 16 16:39:44.946838 ignition[732]: no config URL provided
May 16 16:39:44.946841 ignition[732]: reading system config file "/usr/lib/ignition/user.ign"
May 16 16:39:44.946844 ignition[732]: no config at "/usr/lib/ignition/user.ign"
May 16 16:39:44.947359 ignition[732]: config successfully fetched
May 16 16:39:44.947376 ignition[732]: parsing config with SHA512: 96e501182e2df582dd813069de84cf99ec35d28ca70b6e4877ef284977dc14ad30d16fbc91b9cb9c6b34216879adc15a3aa32b1e59b6a51fbfbd772db805686a
May 16 16:39:44.950911 unknown[732]: fetched base config from "system"
May 16 16:39:44.951455 unknown[732]: fetched user config from "vmware"
May 16 16:39:44.951659 ignition[732]: fetch-offline: fetch-offline passed
May 16 16:39:44.951693 ignition[732]: Ignition finished successfully
May 16 16:39:44.953964 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 16 16:39:44.966146 systemd-networkd[869]: lo: Link UP
May 16 16:39:44.966369 systemd-networkd[869]: lo: Gained carrier
May 16 16:39:44.967176 systemd-networkd[869]: Enumeration completed
May 16 16:39:44.967361 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 16 16:39:44.967513 systemd-networkd[869]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
May 16 16:39:44.967581 systemd[1]: Reached target network.target - Network.
May 16 16:39:44.967703 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
May 16 16:39:44.970625 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
May 16 16:39:44.970744 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
May 16 16:39:44.969163 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 16 16:39:44.971418 systemd-networkd[869]: ens192: Link UP
May 16 16:39:44.971420 systemd-networkd[869]: ens192: Gained carrier
May 16 16:39:44.984120 ignition[873]: Ignition 2.21.0
May 16 16:39:44.984388 ignition[873]: Stage: kargs
May 16 16:39:44.984564 ignition[873]: no configs at "/usr/lib/ignition/base.d"
May 16 16:39:44.984678 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:44.985280 ignition[873]: kargs: kargs passed
May 16 16:39:44.985399 ignition[873]: Ignition finished successfully
May 16 16:39:44.986650 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 16 16:39:44.987459 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 16 16:39:45.005707 ignition[880]: Ignition 2.21.0
May 16 16:39:45.005718 ignition[880]: Stage: disks
May 16 16:39:45.005799 ignition[880]: no configs at "/usr/lib/ignition/base.d"
May 16 16:39:45.005805 ignition[880]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:45.006218 ignition[880]: disks: disks passed
May 16 16:39:45.006243 ignition[880]: Ignition finished successfully
May 16 16:39:45.007103 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 16 16:39:45.007433 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
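The fetch-offline stage above pulled the user config from VMware guestinfo and verified its SHA512; the user, SSH-key and file operations logged later in the files stage are driven by a config of roughly this shape. A hypothetical minimal Ignition v3 config for illustration only; the key and file contents are placeholders, not the config actually fetched here:

  {
    "ignition": { "version": "3.4.0" },
    "passwd": {
      "users": [
        { "name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA...placeholder core@example"] }
      ]
    },
    "storage": {
      "files": [
        {
          "path": "/home/core/install.sh",
          "mode": 493,
          "contents": { "source": "data:,echo%20placeholder" }
        }
      ]
    }
  }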
May 16 16:39:45.007563 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 16 16:39:45.007737 systemd[1]: Reached target local-fs.target - Local File Systems.
May 16 16:39:45.007907 systemd[1]: Reached target sysinit.target - System Initialization.
May 16 16:39:45.008096 systemd[1]: Reached target basic.target - Basic System.
May 16 16:39:45.008743 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 16 16:39:45.030579 systemd-fsck[889]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
May 16 16:39:45.031678 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 16 16:39:45.032305 systemd[1]: Mounting sysroot.mount - /sysroot...
May 16 16:39:45.102060 kernel: EXT4-fs (sda9): mounted filesystem 07293137-138a-42a3-a962-d767034e11a7 r/w with ordered data mode. Quota mode: none.
May 16 16:39:45.102247 systemd[1]: Mounted sysroot.mount - /sysroot.
May 16 16:39:45.102698 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 16 16:39:45.103657 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 16 16:39:45.105104 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 16 16:39:45.105497 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 16 16:39:45.105522 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 16 16:39:45.105536 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 16 16:39:45.110758 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 16 16:39:45.111675 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 16 16:39:45.117110 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (897)
May 16 16:39:45.120524 kernel: BTRFS info (device sda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d
May 16 16:39:45.120545 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 16 16:39:45.120553 kernel: BTRFS info (device sda6): using free-space-tree
May 16 16:39:45.125840 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 16 16:39:45.138189 initrd-setup-root[921]: cut: /sysroot/etc/passwd: No such file or directory
May 16 16:39:45.140699 initrd-setup-root[928]: cut: /sysroot/etc/group: No such file or directory
May 16 16:39:45.143147 initrd-setup-root[935]: cut: /sysroot/etc/shadow: No such file or directory
May 16 16:39:45.145354 initrd-setup-root[942]: cut: /sysroot/etc/gshadow: No such file or directory
May 16 16:39:45.196471 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 16 16:39:45.197247 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 16 16:39:45.198120 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 16 16:39:45.207061 kernel: BTRFS info (device sda6): last unmount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d
May 16 16:39:45.222256 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 16 16:39:45.224012 ignition[1014]: INFO : Ignition 2.21.0
May 16 16:39:45.224012 ignition[1014]: INFO : Stage: mount
May 16 16:39:45.224359 ignition[1014]: INFO : no configs at "/usr/lib/ignition/base.d"
May 16 16:39:45.224359 ignition[1014]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:45.226200 ignition[1014]: INFO : mount: mount passed
May 16 16:39:45.226750 ignition[1014]: INFO : Ignition finished successfully
May 16 16:39:45.227009 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 16 16:39:45.227744 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 16 16:39:45.316074 systemd-resolved[293]: Detected conflict on linux IN A 139.178.70.106
May 16 16:39:45.316365 systemd-resolved[293]: Hostname conflict, changing published hostname from 'linux' to 'linux11'.
May 16 16:39:45.764578 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 16 16:39:45.766312 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 16 16:39:45.783067 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (1026)
May 16 16:39:45.786117 kernel: BTRFS info (device sda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d
May 16 16:39:45.786141 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 16 16:39:45.786157 kernel: BTRFS info (device sda6): using free-space-tree
May 16 16:39:45.790781 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 16 16:39:45.811865 ignition[1042]: INFO : Ignition 2.21.0
May 16 16:39:45.811865 ignition[1042]: INFO : Stage: files
May 16 16:39:45.812245 ignition[1042]: INFO : no configs at "/usr/lib/ignition/base.d"
May 16 16:39:45.812245 ignition[1042]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:45.812541 ignition[1042]: DEBUG : files: compiled without relabeling support, skipping
May 16 16:39:45.813273 ignition[1042]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 16 16:39:45.813273 ignition[1042]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 16 16:39:45.814641 ignition[1042]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 16 16:39:45.814835 ignition[1042]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 16 16:39:45.815116 unknown[1042]: wrote ssh authorized keys file for user: core
May 16 16:39:45.815299 ignition[1042]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 16 16:39:45.816860 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 16 16:39:45.817035 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 16 16:39:45.861358 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 16 16:39:45.983331 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 16 16:39:45.983579 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 16 16:39:45.983579 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 16 16:39:45.983579 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 16 16:39:45.983579 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 16 16:39:45.983579 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 16 16:39:45.984328 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 16 16:39:45.984328 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 16 16:39:45.984328 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 16 16:39:45.984771 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 16 16:39:45.984918 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 16 16:39:45.984918 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 16 16:39:45.987042 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 16 16:39:45.987248 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 16 16:39:45.987248 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
May 16 16:39:46.666216 systemd-networkd[869]: ens192: Gained IPv6LL
May 16 16:39:47.037887 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 16 16:39:47.345847 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
May 16 16:39:47.345847 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
May 16 16:39:47.346787 ignition[1042]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
May 16 16:39:47.346787 ignition[1042]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
May 16 16:39:47.347374 ignition[1042]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
May 16 16:39:47.347971 ignition[1042]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
May 16 16:39:47.372146 ignition[1042]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
May 16 16:39:47.374387 ignition[1042]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
May 16 16:39:47.374387 ignition[1042]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
May 16 16:39:47.374387 ignition[1042]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
May 16 16:39:47.374387 ignition[1042]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
May 16 16:39:47.375570 ignition[1042]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
May 16 16:39:47.375570 ignition[1042]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 16 16:39:47.375570 ignition[1042]: INFO : files: files passed
May 16 16:39:47.375570 ignition[1042]: INFO : Ignition finished successfully
May 16 16:39:47.376199 systemd[1]: Finished ignition-files.service - Ignition (files).
May 16 16:39:47.377014 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 16 16:39:47.379138 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 16 16:39:47.387931 systemd[1]: ignition-quench.service: Deactivated successfully.
May 16 16:39:47.387993 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 16 16:39:47.390304 initrd-setup-root-after-ignition[1074]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 16 16:39:47.390304 initrd-setup-root-after-ignition[1074]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 16 16:39:47.391059 initrd-setup-root-after-ignition[1078]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 16 16:39:47.391694 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 16 16:39:47.391981 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 16 16:39:47.392578 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 16 16:39:47.416124 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 16 16:39:47.416196 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 16 16:39:47.416459 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 16 16:39:47.416571 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 16 16:39:47.416766 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 16 16:39:47.417228 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 16 16:39:47.432080 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
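The op(10)/op(12) entries above are Ignition translating enable/disable requests into a systemd preset, which systemctl preset-all then applies on first boot. A sketch of what such a preset file contains; the path and file name are assumed for illustration:

  # /etc/systemd/system-preset/20-ignition.preset (assumed path/name)
  enable prepare-helm.service
  disable coreos-metadata.service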
May 16 16:39:47.432958 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 16 16:39:47.442523 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 16 16:39:47.442787 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 16 16:39:47.443132 systemd[1]: Stopped target timers.target - Timer Units.
May 16 16:39:47.443359 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 16 16:39:47.443426 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 16 16:39:47.443914 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 16 16:39:47.444196 systemd[1]: Stopped target basic.target - Basic System.
May 16 16:39:47.444411 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 16 16:39:47.444700 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 16 16:39:47.444971 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 16 16:39:47.445223 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 16 16:39:47.445505 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 16 16:39:47.445765 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 16 16:39:47.446034 systemd[1]: Stopped target sysinit.target - System Initialization.
May 16 16:39:47.446319 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 16 16:39:47.446545 systemd[1]: Stopped target swap.target - Swaps.
May 16 16:39:47.446775 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 16 16:39:47.446937 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 16 16:39:47.447282 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 16 16:39:47.447420 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 16 16:39:47.447540 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 16 16:39:47.447939 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 16 16:39:47.448192 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 16 16:39:47.448258 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 16 16:39:47.448692 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 16 16:39:47.448858 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 16 16:39:47.449166 systemd[1]: Stopped target paths.target - Path Units.
May 16 16:39:47.449375 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 16 16:39:47.451073 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 16 16:39:47.451223 systemd[1]: Stopped target slices.target - Slice Units.
May 16 16:39:47.451485 systemd[1]: Stopped target sockets.target - Socket Units.
May 16 16:39:47.451671 systemd[1]: iscsid.socket: Deactivated successfully.
May 16 16:39:47.451722 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 16 16:39:47.451867 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 16 16:39:47.451913 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 16 16:39:47.452090 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 16 16:39:47.452156 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 16 16:39:47.452390 systemd[1]: ignition-files.service: Deactivated successfully.
May 16 16:39:47.452450 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 16 16:39:47.453022 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 16 16:39:47.456105 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 16 16:39:47.456343 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 16 16:39:47.456540 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 16 16:39:47.456886 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 16 16:39:47.457062 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 16 16:39:47.459251 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 16 16:39:47.464131 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 16 16:39:47.472256 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 16 16:39:47.473460 ignition[1099]: INFO : Ignition 2.21.0
May 16 16:39:47.473460 ignition[1099]: INFO : Stage: umount
May 16 16:39:47.473460 ignition[1099]: INFO : no configs at "/usr/lib/ignition/base.d"
May 16 16:39:47.473460 ignition[1099]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
May 16 16:39:47.476671 ignition[1099]: INFO : umount: umount passed
May 16 16:39:47.476671 ignition[1099]: INFO : Ignition finished successfully
May 16 16:39:47.474750 systemd[1]: ignition-mount.service: Deactivated successfully.
May 16 16:39:47.474800 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 16 16:39:47.476463 systemd[1]: Stopped target network.target - Network.
May 16 16:39:47.476874 systemd[1]: ignition-disks.service: Deactivated successfully.
May 16 16:39:47.476912 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 16 16:39:47.477269 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 16 16:39:47.477294 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 16 16:39:47.477501 systemd[1]: ignition-setup.service: Deactivated successfully.
May 16 16:39:47.477522 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 16 16:39:47.477731 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 16 16:39:47.477753 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 16 16:39:47.478023 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 16 16:39:47.478479 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 16 16:39:47.480108 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 16 16:39:47.480169 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 16 16:39:47.481943 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 16 16:39:47.482222 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 16 16:39:47.482265 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 16 16:39:47.482973 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 16 16:39:47.486961 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 16 16:39:47.487028 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 16 16:39:47.487707 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 16 16:39:47.487812 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 16 16:39:47.488167 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 16 16:39:47.488184 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 16 16:39:47.490033 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 16 16:39:47.490161 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 16 16:39:47.490187 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 16 16:39:47.490363 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
May 16 16:39:47.490386 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
May 16 16:39:47.490545 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 16 16:39:47.490567 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 16 16:39:47.491146 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 16 16:39:47.491170 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 16 16:39:47.491313 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 16 16:39:47.492476 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 16 16:39:47.500430 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 16 16:39:47.500622 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 16 16:39:47.501322 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 16 16:39:47.501462 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 16 16:39:47.501737 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 16 16:39:47.501864 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 16 16:39:47.502145 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 16 16:39:47.502274 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 16 16:39:47.502653 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 16 16:39:47.502780 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 16 16:39:47.503103 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 16 16:39:47.503241 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 16 16:39:47.504154 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 16 16:39:47.505079 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 16 16:39:47.505223 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 16 16:39:47.505561 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 16 16:39:47.505585 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 16 16:39:47.506018 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 16 16:39:47.506041 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 16 16:39:47.506569 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 16 16:39:47.506596 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 16 16:39:47.508094 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 16 16:39:47.508122 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 16 16:39:47.508668 systemd[1]: network-cleanup.service: Deactivated successfully.
May 16 16:39:47.511103 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 16 16:39:47.513932 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 16 16:39:47.513984 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 16 16:39:47.527662 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 16 16:39:47.527738 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 16 16:39:47.528229 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 16 16:39:47.528389 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 16 16:39:47.528430 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 16 16:39:47.529224 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 16 16:39:47.538510 systemd[1]: Switching root.
May 16 16:39:47.576031 systemd-journald[245]: Journal stopped
May 16 16:39:48.677585 systemd-journald[245]: Received SIGTERM from PID 1 (systemd).
May 16 16:39:48.677607 kernel: SELinux: policy capability network_peer_controls=1
May 16 16:39:48.677616 kernel: SELinux: policy capability open_perms=1
May 16 16:39:48.677622 kernel: SELinux: policy capability extended_socket_class=1
May 16 16:39:48.677627 kernel: SELinux: policy capability always_check_network=0
May 16 16:39:48.677634 kernel: SELinux: policy capability cgroup_seclabel=1
May 16 16:39:48.677640 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 16 16:39:48.677646 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 16 16:39:48.677652 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 16 16:39:48.677657 kernel: SELinux: policy capability userspace_initial_context=0
May 16 16:39:48.677663 kernel: audit: type=1403 audit(1747413588.146:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 16 16:39:48.677669 systemd[1]: Successfully loaded SELinux policy in 49.553ms.
May 16 16:39:48.677678 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.742ms.
May 16 16:39:48.677685 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 16 16:39:48.677692 systemd[1]: Detected virtualization vmware.
May 16 16:39:48.677698 systemd[1]: Detected architecture x86-64.
May 16 16:39:48.677706 systemd[1]: Detected first boot.
May 16 16:39:48.677713 systemd[1]: Initializing machine ID from random generator.
May 16 16:39:48.677796 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
May 16 16:39:48.677808 zram_generator::config[1142]: No configuration found.
May 16 16:39:48.677815 kernel: Guest personality initialized and is active
May 16 16:39:48.677821 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 16 16:39:48.677827 kernel: Initialized host personality
May 16 16:39:48.677835 kernel: NET: Registered PF_VSOCK protocol family
May 16 16:39:48.677842 systemd[1]: Populated /etc with preset unit settings.
May 16 16:39:48.677849 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
May 16 16:39:48.677856 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
May 16 16:39:48.677863 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 16 16:39:48.677869 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 16 16:39:48.677876 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 16 16:39:48.677883 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 16 16:39:48.677891 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 16 16:39:48.677897 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 16 16:39:48.677904 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 16 16:39:48.677910 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 16 16:39:48.677917 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 16 16:39:48.677924 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 16 16:39:48.677932 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 16 16:39:48.677939 systemd[1]: Created slice user.slice - User and Session Slice.
May 16 16:39:48.677946 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 16 16:39:48.677954 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 16 16:39:48.677961 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 16 16:39:48.677968 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 16 16:39:48.677975 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 16 16:39:48.677982 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 16 16:39:48.677990 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 16 16:39:48.677997 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 16 16:39:48.678003 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 16 16:39:48.678010 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 16 16:39:48.678017 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 16 16:39:48.678024 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 16 16:39:48.678030 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 16 16:39:48.678037 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 16 16:39:48.678045 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 16 16:39:48.678061 systemd[1]: Reached target slices.target - Slice Units.
May 16 16:39:48.678069 systemd[1]: Reached target swap.target - Swaps.
May 16 16:39:48.678075 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 16 16:39:48.678083 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 16 16:39:48.678091 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 16 16:39:48.678098 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 16 16:39:48.678104 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 16 16:39:48.678111 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 16 16:39:48.678118 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 16 16:39:48.678125 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 16 16:39:48.678132 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 16 16:39:48.678139 systemd[1]: Mounting media.mount - External Media Directory...
May 16 16:39:48.678147 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:48.678154 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 16 16:39:48.678161 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 16 16:39:48.678168 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 16 16:39:48.678175 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 16 16:39:48.678181 systemd[1]: Reached target machines.target - Containers.
May 16 16:39:48.678188 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 16 16:39:48.678195 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
May 16 16:39:48.678204 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 16 16:39:48.678211 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 16 16:39:48.678218 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 16 16:39:48.678225 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 16 16:39:48.678231 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 16 16:39:48.678238 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 16 16:39:48.678245 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 16 16:39:48.678252 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 16 16:39:48.678260 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 16 16:39:48.678267 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 16 16:39:48.678273 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 16 16:39:48.678280 systemd[1]: Stopped systemd-fsck-usr.service.
May 16 16:39:48.678287 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 16 16:39:48.678294 systemd[1]: Starting systemd-journald.service - Journal Service...
May 16 16:39:48.678301 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 16 16:39:48.678307 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 16 16:39:48.678314 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 16 16:39:48.680090 kernel: loop: module loaded
May 16 16:39:48.680100 kernel: fuse: init (API version 7.41)
May 16 16:39:48.680107 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 16 16:39:48.680124 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 16 16:39:48.680133 systemd[1]: verity-setup.service: Deactivated successfully.
May 16 16:39:48.680141 systemd[1]: Stopped verity-setup.service.
May 16 16:39:48.680148 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:48.680155 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 16 16:39:48.680164 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 16 16:39:48.680171 systemd[1]: Mounted media.mount - External Media Directory.
May 16 16:39:48.680178 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 16 16:39:48.680184 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 16 16:39:48.680191 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 16 16:39:48.680198 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 16 16:39:48.680205 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 16 16:39:48.680212 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 16 16:39:48.680219 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 16 16:39:48.680227 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 16 16:39:48.680234 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 16 16:39:48.680255 systemd-journald[1239]: Collecting audit messages is disabled.
May 16 16:39:48.680272 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 16 16:39:48.680280 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 16 16:39:48.680287 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 16 16:39:48.680295 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 16 16:39:48.680302 systemd-journald[1239]: Journal started
May 16 16:39:48.680316 systemd-journald[1239]: Runtime Journal (/run/log/journal/a1e01b1bc17c4c18b3ef8aecf2e79ed0) is 4.8M, max 38.8M, 34M free.
May 16 16:39:48.499263 systemd[1]: Queued start job for default target multi-user.target.
May 16 16:39:48.519567 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
May 16 16:39:48.519836 systemd[1]: systemd-journald.service: Deactivated successfully.
May 16 16:39:48.680795 jq[1212]: true
May 16 16:39:48.682077 systemd[1]: Started systemd-journald.service - Journal Service.
May 16 16:39:48.682356 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 16 16:39:48.688352 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 16 16:39:48.690082 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 16 16:39:48.690344 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 16 16:39:48.690591 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 16 16:39:48.698303 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 16 16:39:48.702151 jq[1257]: true
May 16 16:39:48.702352 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 16 16:39:48.705186 kernel: ACPI: bus type drm_connector registered
May 16 16:39:48.703972 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 16 16:39:48.704259 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 16 16:39:48.704278 systemd[1]: Reached target local-fs.target - Local File Systems.
May 16 16:39:48.704973 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 16 16:39:48.708275 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 16 16:39:48.708451 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 16 16:39:48.711156 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 16 16:39:48.711990 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 16 16:39:48.712138 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 16 16:39:48.714174 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 16 16:39:48.714308 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 16 16:39:48.716138 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 16 16:39:48.719204 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 16 16:39:48.722809 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 16 16:39:48.723958 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 16 16:39:48.725085 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 16 16:39:48.725384 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 16 16:39:48.726046 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 16 16:39:48.727222 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 16 16:39:48.742890 systemd-journald[1239]: Time spent on flushing to /var/log/journal/a1e01b1bc17c4c18b3ef8aecf2e79ed0 is 32.112ms for 1760 entries.
May 16 16:39:48.742890 systemd-journald[1239]: System Journal (/var/log/journal/a1e01b1bc17c4c18b3ef8aecf2e79ed0) is 8M, max 584.8M, 576.8M free.
May 16 16:39:48.808441 systemd-journald[1239]: Received client request to flush runtime journal.
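The journald entries above show the volatile runtime journal in /run/log/journal being flushed into the persistent system journal under /var/log/journal, each with its own size cap. Those caps are tunable; an illustrative drop-in with example values, not the limits journald computed on this machine:

  # /etc/systemd/journald.conf.d/10-size.conf (hypothetical drop-in)
  [Journal]
  SystemMaxUse=500M
  RuntimeMaxUse=32M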
May 16 16:39:48.808476 kernel: loop0: detected capacity change from 0 to 2960
May 16 16:39:48.785879 ignition[1285]: Ignition 2.21.0
May 16 16:39:48.745086 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 16 16:39:48.786080 ignition[1285]: deleting config from guestinfo properties
May 16 16:39:48.746209 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 16 16:39:48.799990 ignition[1285]: Successfully deleted config
May 16 16:39:48.749822 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 16 16:39:48.778981 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 16 16:39:48.805464 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
May 16 16:39:48.809710 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 16 16:39:48.816086 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 16 16:39:48.817061 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 16 16:39:48.818621 systemd-tmpfiles[1284]: ACLs are not supported, ignoring.
May 16 16:39:48.818632 systemd-tmpfiles[1284]: ACLs are not supported, ignoring.
May 16 16:39:48.825670 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 16 16:39:48.828152 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 16 16:39:48.841768 kernel: loop1: detected capacity change from 0 to 221472
May 16 16:39:48.869075 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 16 16:39:48.872175 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 16 16:39:48.889064 kernel: loop2: detected capacity change from 0 to 146240
May 16 16:39:48.896044 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 16 16:39:48.901013 systemd-tmpfiles[1313]: ACLs are not supported, ignoring.
May 16 16:39:48.901235 systemd-tmpfiles[1313]: ACLs are not supported, ignoring.
May 16 16:39:48.903781 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 16 16:39:48.939233 kernel: loop3: detected capacity change from 0 to 113872
May 16 16:39:48.974154 kernel: loop4: detected capacity change from 0 to 2960
May 16 16:39:48.992086 kernel: loop5: detected capacity change from 0 to 221472
May 16 16:39:49.013136 kernel: loop6: detected capacity change from 0 to 146240
May 16 16:39:49.044068 kernel: loop7: detected capacity change from 0 to 113872
May 16 16:39:49.216549 (sd-merge)[1320]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
May 16 16:39:49.219372 (sd-merge)[1320]: Merged extensions into '/usr'.
May 16 16:39:49.226118 systemd[1]: Reload requested from client PID 1283 ('systemd-sysext') (unit systemd-sysext.service)...
May 16 16:39:49.226127 systemd[1]: Reloading...
May 16 16:39:49.266064 zram_generator::config[1343]: No configuration found.
May 16 16:39:49.379668 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:39:49.388865 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
May 16 16:39:49.433763 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 16 16:39:49.433868 systemd[1]: Reloading finished in 207 ms.
May 16 16:39:49.443345 ldconfig[1278]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 16 16:39:49.448958 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 16 16:39:49.449270 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 16 16:39:49.454854 systemd[1]: Starting ensure-sysext.service...
May 16 16:39:49.455751 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 16 16:39:49.472594 systemd[1]: Reload requested from client PID 1402 ('systemctl') (unit ensure-sysext.service)...
May 16 16:39:49.472604 systemd[1]: Reloading...
May 16 16:39:49.474787 systemd-tmpfiles[1403]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 16 16:39:49.474808 systemd-tmpfiles[1403]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 16 16:39:49.475195 systemd-tmpfiles[1403]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 16 16:39:49.475346 systemd-tmpfiles[1403]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 16 16:39:49.475808 systemd-tmpfiles[1403]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 16 16:39:49.475966 systemd-tmpfiles[1403]: ACLs are not supported, ignoring.
May 16 16:39:49.475999 systemd-tmpfiles[1403]: ACLs are not supported, ignoring.
May 16 16:39:49.489317 systemd-tmpfiles[1403]: Detected autofs mount point /boot during canonicalization of boot.
May 16 16:39:49.489421 systemd-tmpfiles[1403]: Skipping /boot
May 16 16:39:49.496197 systemd-tmpfiles[1403]: Detected autofs mount point /boot during canonicalization of boot.
May 16 16:39:49.498167 systemd-tmpfiles[1403]: Skipping /boot
May 16 16:39:49.516060 zram_generator::config[1430]: No configuration found.
May 16 16:39:49.588734 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:39:49.596761 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
May 16 16:39:49.640725 systemd[1]: Reloading finished in 167 ms.
May 16 16:39:49.648524 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 16 16:39:49.651219 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 16 16:39:49.660801 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 16 16:39:49.663224 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 16 16:39:49.664593 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 16 16:39:49.666280 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 16 16:39:49.669474 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 16 16:39:49.673534 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
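The repeated "Ignoring unknown escape sequences" warning above is systemd flagging an ExecStart line in coreos-metadata.service that embeds a shell pipeline directly, so systemd itself tries to interpret the '$' and '\' characters. One way to silence it, sketched here against an assumed unit layout rather than the unit's actual contents, is to use systemd's own escapes ('$$' for a literal '$', '\\' for a literal '\') so the pipeline reaches bash unchanged:

  # Hypothetical rewrite of the offending [Service] line:
  ExecStart=/usr/bin/bash -c 'echo "COREOS_CUSTOM_PRIVATE_IPV4=$$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \\K[\\d.]+")" > $${OUTPUT}'

Moving the pipeline into a standalone script invoked by ExecStart avoids the escaping problem entirely.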
May 16 16:39:49.678343 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.681380 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 16 16:39:49.686801 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 16 16:39:49.688751 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 16 16:39:49.688898 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 16 16:39:49.688967 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 16 16:39:49.689037 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.692039 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 16 16:39:49.696496 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.696647 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 16 16:39:49.696736 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 16 16:39:49.696827 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.701321 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 16 16:39:49.705353 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 16 16:39:49.705778 systemd[1]: Finished ensure-sysext.service.
May 16 16:39:49.705989 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 16 16:39:49.706105 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 16 16:39:49.708499 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.712129 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 16 16:39:49.712307 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 16 16:39:49.712329 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 16 16:39:49.715373 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 16 16:39:49.717583 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 16 16:39:49.717693 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:39:49.717866 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 16 16:39:49.717982 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 16 16:39:49.718213 systemd[1]: modprobe@loop.service: Deactivated successfully. May 16 16:39:49.718314 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 16 16:39:49.718846 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 16 16:39:49.718884 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 16 16:39:49.728366 augenrules[1526]: No rules May 16 16:39:49.728691 systemd[1]: audit-rules.service: Deactivated successfully. May 16 16:39:49.729118 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 16 16:39:49.730245 systemd[1]: modprobe@drm.service: Deactivated successfully. May 16 16:39:49.730563 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 16 16:39:49.731473 systemd-udevd[1495]: Using default interface naming scheme 'v255'. May 16 16:39:49.738384 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 16 16:39:49.748462 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 16 16:39:49.749618 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 16 16:39:49.749830 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 16 16:39:49.754151 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 16:39:49.758572 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 16 16:39:49.804410 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 16 16:39:49.806149 systemd[1]: Reached target time-set.target - System Time Set. May 16 16:39:49.860988 systemd-resolved[1492]: Positive Trust Anchors: May 16 16:39:49.860996 systemd-resolved[1492]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 16 16:39:49.861019 systemd-resolved[1492]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 16 16:39:49.861657 systemd-networkd[1541]: lo: Link UP May 16 16:39:49.863065 systemd-networkd[1541]: lo: Gained carrier May 16 16:39:49.863509 systemd-networkd[1541]: Enumeration completed May 16 16:39:49.863555 systemd[1]: Started systemd-networkd.service - Network Configuration. May 16 16:39:49.865199 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 16 16:39:49.866755 systemd-resolved[1492]: Defaulting to hostname 'linux'. May 16 16:39:49.867177 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 16 16:39:49.870616 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
May 16 16:39:49.870742 systemd[1]: Reached target network.target - Network. May 16 16:39:49.870826 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 16 16:39:49.871097 systemd[1]: Reached target sysinit.target - System Initialization. May 16 16:39:49.871253 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 16 16:39:49.871533 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 16 16:39:49.871949 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 16 16:39:49.872153 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 16 16:39:49.872280 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 16 16:39:49.872492 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 16 16:39:49.872602 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 16 16:39:49.872618 systemd[1]: Reached target paths.target - Path Units. May 16 16:39:49.872929 systemd[1]: Reached target timers.target - Timer Units. May 16 16:39:49.873669 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 16 16:39:49.875473 systemd[1]: Starting docker.socket - Docker Socket for the API... May 16 16:39:49.877928 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 16 16:39:49.878576 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 16 16:39:49.878687 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 16 16:39:49.880706 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 16 16:39:49.880984 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 16 16:39:49.881959 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 16 16:39:49.883476 systemd[1]: Reached target sockets.target - Socket Units. May 16 16:39:49.883656 systemd[1]: Reached target basic.target - Basic System. May 16 16:39:49.885156 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 16 16:39:49.885172 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 16 16:39:49.889204 systemd[1]: Starting containerd.service - containerd container runtime... May 16 16:39:49.891398 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 16 16:39:49.897351 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 16 16:39:49.899669 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 16 16:39:49.900517 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 16 16:39:49.900624 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 16 16:39:49.903290 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 16 16:39:49.906150 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 16 16:39:49.908971 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
May 16 16:39:49.914511 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 16 16:39:49.917496 jq[1578]: false May 16 16:39:49.918179 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 16 16:39:49.921285 systemd[1]: Starting systemd-logind.service - User Login Management... May 16 16:39:49.921841 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 16 16:39:49.923577 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Refreshing passwd entry cache May 16 16:39:49.923581 oslogin_cache_refresh[1580]: Refreshing passwd entry cache May 16 16:39:49.925345 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 16 16:39:49.926239 systemd[1]: Starting update-engine.service - Update Engine... May 16 16:39:49.927121 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 16 16:39:49.931662 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... May 16 16:39:49.934216 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 16 16:39:49.936056 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Failure getting users, quitting May 16 16:39:49.936056 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 16 16:39:49.936056 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Refreshing group entry cache May 16 16:39:49.936056 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Failure getting groups, quitting May 16 16:39:49.936056 google_oslogin_nss_cache[1580]: oslogin_cache_refresh[1580]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 16 16:39:49.935494 oslogin_cache_refresh[1580]: Failure getting users, quitting May 16 16:39:49.935504 oslogin_cache_refresh[1580]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 16 16:39:49.935525 oslogin_cache_refresh[1580]: Refreshing group entry cache May 16 16:39:49.935808 oslogin_cache_refresh[1580]: Failure getting groups, quitting May 16 16:39:49.935812 oslogin_cache_refresh[1580]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 16 16:39:49.936299 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 16 16:39:49.936526 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 16 16:39:49.937078 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
May 16 16:39:49.940662 extend-filesystems[1579]: Found loop4 May 16 16:39:49.940662 extend-filesystems[1579]: Found loop5 May 16 16:39:49.940662 extend-filesystems[1579]: Found loop6 May 16 16:39:49.940662 extend-filesystems[1579]: Found loop7 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda May 16 16:39:49.940662 extend-filesystems[1579]: Found sda1 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda2 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda3 May 16 16:39:49.940662 extend-filesystems[1579]: Found usr May 16 16:39:49.940662 extend-filesystems[1579]: Found sda4 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda6 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda7 May 16 16:39:49.940662 extend-filesystems[1579]: Found sda9 May 16 16:39:49.940662 extend-filesystems[1579]: Found sr0 May 16 16:39:49.941333 systemd[1]: extend-filesystems.service: Deactivated successfully. May 16 16:39:49.941447 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 16 16:39:49.941696 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 16 16:39:49.942166 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 16 16:39:49.942395 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 16 16:39:49.942492 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 16 16:39:49.946743 systemd[1]: motdgen.service: Deactivated successfully. May 16 16:39:49.950179 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 16 16:39:49.960699 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 16 16:39:49.968277 jq[1593]: true May 16 16:39:49.969984 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. May 16 16:39:49.977530 (ntainerd)[1610]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 16 16:39:49.981681 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... May 16 16:39:49.987117 update_engine[1592]: I20250516 16:39:49.986721 1592 main.cc:92] Flatcar Update Engine starting May 16 16:39:49.988650 tar[1600]: linux-amd64/helm May 16 16:39:50.001537 dbus-daemon[1576]: [system] SELinux support is enabled May 16 16:39:50.001626 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 16 16:39:50.004970 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 16 16:39:50.004987 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 16 16:39:50.005308 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 16 16:39:50.005318 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 16 16:39:50.008853 jq[1618]: true May 16 16:39:50.014199 systemd[1]: Started update-engine.service - Update Engine. 
May 16 16:39:50.016448 update_engine[1592]: I20250516 16:39:50.015322 1592 update_check_scheduler.cc:74] Next update check in 2m59s May 16 16:39:50.029885 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated May 16 16:39:50.030028 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps May 16 16:39:50.027708 systemd-networkd[1541]: ens192: Configuring with /etc/systemd/network/00-vmware.network. May 16 16:39:50.043660 systemd-networkd[1541]: ens192: Link UP May 16 16:39:50.043790 systemd-networkd[1541]: ens192: Gained carrier May 16 16:39:50.053394 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. May 16 16:39:50.057171 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 16 16:39:50.059501 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. May 16 16:39:50.062643 unknown[1617]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath May 16 16:39:50.071021 unknown[1617]: Core dump limit set to -1 May 16 16:39:50.111946 bash[1646]: Updated "/home/core/.ssh/authorized_keys" May 16 16:39:50.112778 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 16 16:39:50.113225 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 16 16:39:50.123632 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. May 16 16:39:50.124443 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 16 16:39:50.138275 systemd-logind[1588]: New seat seat0. May 16 16:39:50.142280 systemd[1]: Started systemd-logind.service - User Login Management. May 16 16:39:50.149733 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
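The ens192 link above was configured from /etc/systemd/network/00-vmware.network. The log does not show that file's contents; a minimal sketch of what a DHCP-based .network file for this interface typically looks like:

    [Match]
    Name=ens192

    [Network]
    DHCP=yes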
May 16 16:39:50.157079 kernel: mousedev: PS/2 mouse device common for all mice May 16 16:39:50.175419 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 May 16 16:39:50.188397 kernel: ACPI: button: Power Button [PWRF] May 16 16:39:50.238945 locksmithd[1625]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 16 16:39:50.336791 containerd[1610]: time="2025-05-16T16:39:50Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 16 16:39:50.341737 containerd[1610]: time="2025-05-16T16:39:50.341719501Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 16 16:39:50.361785 sshd_keygen[1605]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 16 16:39:50.366040 containerd[1610]: time="2025-05-16T16:39:50.365932833Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.071µs" May 16 16:39:50.366040 containerd[1610]: time="2025-05-16T16:39:50.365952886Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 16 16:39:50.366040 containerd[1610]: time="2025-05-16T16:39:50.365964272Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 16 16:39:50.366139 containerd[1610]: time="2025-05-16T16:39:50.366039758Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 16 16:39:50.366139 containerd[1610]: time="2025-05-16T16:39:50.366056033Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366398556Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366445547Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366453581Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366813308Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366823710Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366830599Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366835588Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.366879665Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs 
type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.367581273Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.367601405Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 16 16:39:50.368064 containerd[1610]: time="2025-05-16T16:39:50.367608217Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 16 16:39:50.368214 containerd[1610]: time="2025-05-16T16:39:50.367707137Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 16 16:39:50.368453 containerd[1610]: time="2025-05-16T16:39:50.368439033Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 16 16:39:50.368490 containerd[1610]: time="2025-05-16T16:39:50.368478036Z" level=info msg="metadata content store policy set" policy=shared May 16 16:39:50.387223 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 16 16:39:50.388719 systemd[1]: Starting issuegen.service - Generate /run/issue... May 16 16:39:50.402026 systemd[1]: issuegen.service: Deactivated successfully. May 16 16:39:50.402697 systemd[1]: Finished issuegen.service - Generate /run/issue. May 16 16:39:50.404302 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 16 16:39:50.427099 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! May 16 16:39:50.449713 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 16 16:39:50.450731 systemd[1]: Started getty@tty1.service - Getty on tty1. May 16 16:39:50.453274 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 16 16:39:50.453451 systemd[1]: Reached target getty.target - Login Prompts. May 16 16:39:50.454590 (udev-worker)[1551]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. May 16 16:39:50.473776 systemd-logind[1588]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 16 16:39:50.480358 systemd-logind[1588]: Watching system buttons on /dev/input/event2 (Power Button) May 16 16:39:50.488202 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704560733Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704601435Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704611269Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704618121Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704625699Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704631408Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704638323Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704644558Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704651377Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704657150Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704662206Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704669006Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704739526Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 16 16:39:50.705593 containerd[1610]: time="2025-05-16T16:39:50.704751661Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704762663Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704768789Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704774939Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704780958Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704787340Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704792742Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 
16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704799252Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704804950Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704811404Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704848745Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704856433Z" level=info msg="Start snapshots syncer" May 16 16:39:50.705801 containerd[1610]: time="2025-05-16T16:39:50.704870081Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 16 16:39:50.705954 containerd[1610]: time="2025-05-16T16:39:50.705012314Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 16 16:39:50.705954 containerd[1610]: time="2025-05-16T16:39:50.705042842Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 16 16:39:50.706667 containerd[1610]: time="2025-05-16T16:39:50.706613204Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706819713Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706836336Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706843633Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706852229Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706859108Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706866742Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706873200Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706887438Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706894463Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 16 16:39:50.706917 containerd[1610]: time="2025-05-16T16:39:50.706900478Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707439636Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707454308Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707460256Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707465807Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707470577Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707476735Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707505608Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707517360Z" level=info msg="runtime interface created" May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707520444Z" level=info msg="created NRI interface" May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707525368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707532776Z" level=info msg="Connect containerd service" May 16 16:39:50.707658 containerd[1610]: time="2025-05-16T16:39:50.707549863Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 
16 16:39:50.709269 containerd[1610]: time="2025-05-16T16:39:50.709256520Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 16 16:39:50.806009 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845562731Z" level=info msg="Start subscribing containerd event" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845597939Z" level=info msg="Start recovering state" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845664067Z" level=info msg="Start event monitor" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845669045Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845676226Z" level=info msg="Start cni network conf syncer for default" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845696640Z" level=info msg="Start streaming server" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845704292Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845706740Z" level=info msg=serving... address=/run/containerd/containerd.sock May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845708182Z" level=info msg="runtime interface starting up..." May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845734417Z" level=info msg="starting plugins..." May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845743469Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 16 16:39:50.846675 containerd[1610]: time="2025-05-16T16:39:50.845805114Z" level=info msg="containerd successfully booted in 0.509462s" May 16 16:39:50.845888 systemd[1]: Started containerd.service - containerd container runtime. May 16 16:39:50.882717 tar[1600]: linux-amd64/LICENSE May 16 16:39:50.882845 tar[1600]: linux-amd64/README.md May 16 16:39:50.896017 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 16 16:39:51.658326 systemd-networkd[1541]: ens192: Gained IPv6LL May 16 16:39:51.658896 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. May 16 16:39:51.660425 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 16 16:39:51.660914 systemd[1]: Reached target network-online.target - Network is Online. May 16 16:39:51.662382 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... May 16 16:39:51.664261 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:39:51.665878 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 16 16:39:51.690597 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 16 16:39:51.696191 systemd[1]: coreos-metadata.service: Deactivated successfully. May 16 16:39:51.696338 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. May 16 16:39:51.696656 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 16 16:39:52.518938 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
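The earlier "failed to load cni during init" error is expected on a node that has not yet joined a cluster: per the cri plugin config logged above, containerd looks for a network config under /etc/cni/net.d and retries once one appears. A minimal sketch of a bridge conflist that would satisfy it; the network name and subnet are illustrative assumptions:

    {
      "cniVersion": "1.0.0",
      "name": "bridge-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": {
            "type": "host-local",
            "subnet": "10.244.0.0/24",
            "routes": [ { "dst": "0.0.0.0/0" } ]
          }
        }
      ]
    }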
May 16 16:39:52.519348 systemd[1]: Reached target multi-user.target - Multi-User System. May 16 16:39:52.521310 systemd[1]: Startup finished in 2.422s (kernel) + 5.537s (initrd) + 4.423s (userspace) = 12.383s. May 16 16:39:52.525342 (kubelet)[1795]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:39:52.557183 login[1735]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 16 16:39:52.557441 login[1734]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 16 16:39:52.561778 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 16 16:39:52.562929 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 16 16:39:52.567692 systemd-logind[1588]: New session 1 of user core. May 16 16:39:52.570201 systemd-logind[1588]: New session 2 of user core. May 16 16:39:52.574979 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 16 16:39:52.577046 systemd[1]: Starting user@500.service - User Manager for UID 500... May 16 16:39:52.585081 (systemd)[1802]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 16 16:39:52.587096 systemd-logind[1588]: New session c1 of user core. May 16 16:39:52.679747 systemd[1802]: Queued start job for default target default.target. May 16 16:39:52.683816 systemd[1802]: Created slice app.slice - User Application Slice. May 16 16:39:52.683833 systemd[1802]: Reached target paths.target - Paths. May 16 16:39:52.683856 systemd[1802]: Reached target timers.target - Timers. May 16 16:39:52.684615 systemd[1802]: Starting dbus.socket - D-Bus User Message Bus Socket... May 16 16:39:52.697152 systemd[1802]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 16 16:39:52.697218 systemd[1802]: Reached target sockets.target - Sockets. May 16 16:39:52.697246 systemd[1802]: Reached target basic.target - Basic System. May 16 16:39:52.697268 systemd[1802]: Reached target default.target - Main User Target. May 16 16:39:52.697285 systemd[1802]: Startup finished in 106ms. May 16 16:39:52.697288 systemd[1]: Started user@500.service - User Manager for UID 500. May 16 16:39:52.704243 systemd[1]: Started session-1.scope - Session 1 of User core. May 16 16:39:52.705396 systemd[1]: Started session-2.scope - Session 2 of User core. May 16 16:39:53.530725 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. May 16 16:39:53.908894 kubelet[1795]: E0516 16:39:53.908829 1795 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:39:53.910332 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:39:53.910417 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:39:53.910615 systemd[1]: kubelet.service: Consumed 626ms CPU time, 264.4M memory peak. May 16 16:40:03.913509 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 16 16:40:03.914807 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:40:04.240957 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 16:40:04.244232 (kubelet)[1846]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:40:04.298115 kubelet[1846]: E0516 16:40:04.298080 1846 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:40:04.300651 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:40:04.300805 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:40:04.301201 systemd[1]: kubelet.service: Consumed 102ms CPU time, 108.9M memory peak. May 16 16:40:14.413346 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 16 16:40:14.414715 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:40:14.861986 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:40:14.864501 (kubelet)[1861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:40:14.924236 kubelet[1861]: E0516 16:40:14.924202 1861 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:40:14.925760 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:40:14.925902 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:40:14.926174 systemd[1]: kubelet.service: Consumed 91ms CPU time, 108M memory peak. May 16 16:40:20.179936 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 16 16:40:20.182194 systemd[1]: Started sshd@0-139.178.70.106:22-147.75.109.163:54596.service - OpenSSH per-connection server daemon (147.75.109.163:54596). May 16 16:40:20.310114 sshd[1869]: Accepted publickey for core from 147.75.109.163 port 54596 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.311182 sshd-session[1869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.313748 systemd-logind[1588]: New session 3 of user core. May 16 16:40:20.324239 systemd[1]: Started session-3.scope - Session 3 of User core. May 16 16:40:20.376193 systemd[1]: Started sshd@1-139.178.70.106:22-147.75.109.163:54610.service - OpenSSH per-connection server daemon (147.75.109.163:54610). May 16 16:40:20.417849 sshd[1874]: Accepted publickey for core from 147.75.109.163 port 54610 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.418933 sshd-session[1874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.422617 systemd-logind[1588]: New session 4 of user core. May 16 16:40:20.431204 systemd[1]: Started session-4.scope - Session 4 of User core. May 16 16:40:20.478992 sshd[1876]: Connection closed by 147.75.109.163 port 54610 May 16 16:40:20.479362 sshd-session[1874]: pam_unix(sshd:session): session closed for user core May 16 16:40:20.488439 systemd[1]: sshd@1-139.178.70.106:22-147.75.109.163:54610.service: Deactivated successfully. 
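The kubelet crash loop above (restart counters 1 and 2, with a third attempt later) is the normal pre-bootstrap state: the unit starts, finds no /var/lib/kubelet/config.yaml, exits, and systemd schedules the next attempt roughly every ten seconds. kubeadm init or kubeadm join normally writes that file during cluster bootstrap; for orientation only, a minimal hand-written sketch of the expected format, with illustrative values:

    # /var/lib/kubelet/config.yaml as kubeadm would generate it (sketch)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd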
May 16 16:40:20.489499 systemd[1]: session-4.scope: Deactivated successfully. May 16 16:40:20.490002 systemd-logind[1588]: Session 4 logged out. Waiting for processes to exit. May 16 16:40:20.491706 systemd[1]: Started sshd@2-139.178.70.106:22-147.75.109.163:54626.service - OpenSSH per-connection server daemon (147.75.109.163:54626). May 16 16:40:20.492450 systemd-logind[1588]: Removed session 4. May 16 16:40:20.535828 sshd[1882]: Accepted publickey for core from 147.75.109.163 port 54626 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.536722 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.539466 systemd-logind[1588]: New session 5 of user core. May 16 16:40:20.546202 systemd[1]: Started session-5.scope - Session 5 of User core. May 16 16:40:20.592039 sshd[1884]: Connection closed by 147.75.109.163 port 54626 May 16 16:40:20.592425 sshd-session[1882]: pam_unix(sshd:session): session closed for user core May 16 16:40:20.600121 systemd[1]: sshd@2-139.178.70.106:22-147.75.109.163:54626.service: Deactivated successfully. May 16 16:40:20.601058 systemd[1]: session-5.scope: Deactivated successfully. May 16 16:40:20.601806 systemd-logind[1588]: Session 5 logged out. Waiting for processes to exit. May 16 16:40:20.602755 systemd[1]: Started sshd@3-139.178.70.106:22-147.75.109.163:54640.service - OpenSSH per-connection server daemon (147.75.109.163:54640). May 16 16:40:20.604489 systemd-logind[1588]: Removed session 5. May 16 16:40:20.641381 sshd[1890]: Accepted publickey for core from 147.75.109.163 port 54640 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.642227 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.645865 systemd-logind[1588]: New session 6 of user core. May 16 16:40:20.653224 systemd[1]: Started session-6.scope - Session 6 of User core. May 16 16:40:20.702713 sshd[1892]: Connection closed by 147.75.109.163 port 54640 May 16 16:40:20.703261 sshd-session[1890]: pam_unix(sshd:session): session closed for user core May 16 16:40:20.709398 systemd[1]: sshd@3-139.178.70.106:22-147.75.109.163:54640.service: Deactivated successfully. May 16 16:40:20.710525 systemd[1]: session-6.scope: Deactivated successfully. May 16 16:40:20.711080 systemd-logind[1588]: Session 6 logged out. Waiting for processes to exit. May 16 16:40:20.712901 systemd[1]: Started sshd@4-139.178.70.106:22-147.75.109.163:54648.service - OpenSSH per-connection server daemon (147.75.109.163:54648). May 16 16:40:20.713674 systemd-logind[1588]: Removed session 6. May 16 16:40:20.753348 sshd[1898]: Accepted publickey for core from 147.75.109.163 port 54648 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.754202 sshd-session[1898]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.757076 systemd-logind[1588]: New session 7 of user core. May 16 16:40:20.767196 systemd[1]: Started session-7.scope - Session 7 of User core. 
May 16 16:40:20.824482 sudo[1901]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 16 16:40:20.824671 sudo[1901]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:40:20.837571 sudo[1901]: pam_unix(sudo:session): session closed for user root May 16 16:40:20.838480 sshd[1900]: Connection closed by 147.75.109.163 port 54648 May 16 16:40:20.838917 sshd-session[1898]: pam_unix(sshd:session): session closed for user core May 16 16:40:20.848275 systemd[1]: sshd@4-139.178.70.106:22-147.75.109.163:54648.service: Deactivated successfully. May 16 16:40:20.849232 systemd[1]: session-7.scope: Deactivated successfully. May 16 16:40:20.849720 systemd-logind[1588]: Session 7 logged out. Waiting for processes to exit. May 16 16:40:20.851311 systemd[1]: Started sshd@5-139.178.70.106:22-147.75.109.163:54652.service - OpenSSH per-connection server daemon (147.75.109.163:54652). May 16 16:40:20.851960 systemd-logind[1588]: Removed session 7. May 16 16:40:20.895962 sshd[1907]: Accepted publickey for core from 147.75.109.163 port 54652 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:20.896849 sshd-session[1907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:20.899526 systemd-logind[1588]: New session 8 of user core. May 16 16:40:20.908202 systemd[1]: Started session-8.scope - Session 8 of User core. May 16 16:40:20.956825 sudo[1911]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 16 16:40:20.957218 sudo[1911]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:40:20.960618 sudo[1911]: pam_unix(sudo:session): session closed for user root May 16 16:40:20.964807 sudo[1910]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 16 16:40:20.965328 sudo[1910]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:40:20.972433 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 16 16:40:21.000864 augenrules[1933]: No rules May 16 16:40:21.001510 systemd[1]: audit-rules.service: Deactivated successfully. May 16 16:40:21.001736 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 16 16:40:21.002454 sudo[1910]: pam_unix(sudo:session): session closed for user root May 16 16:40:21.003213 sshd[1909]: Connection closed by 147.75.109.163 port 54652 May 16 16:40:21.003884 sshd-session[1907]: pam_unix(sshd:session): session closed for user core May 16 16:40:21.008863 systemd[1]: sshd@5-139.178.70.106:22-147.75.109.163:54652.service: Deactivated successfully. May 16 16:40:21.009706 systemd[1]: session-8.scope: Deactivated successfully. May 16 16:40:21.010185 systemd-logind[1588]: Session 8 logged out. Waiting for processes to exit. May 16 16:40:21.011416 systemd[1]: Started sshd@6-139.178.70.106:22-147.75.109.163:54664.service - OpenSSH per-connection server daemon (147.75.109.163:54664). May 16 16:40:21.013268 systemd-logind[1588]: Removed session 8. May 16 16:40:21.052985 sshd[1942]: Accepted publickey for core from 147.75.109.163 port 54664 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:40:21.054545 sshd-session[1942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:40:21.059930 systemd-logind[1588]: New session 9 of user core. May 16 16:40:21.066183 systemd[1]: Started session-9.scope - Session 9 of User core. 
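The "augenrules: No rules" line above simply reports that /etc/audit/rules.d was empty after the two default rule files were removed by sudo, so audit-rules.service loaded an empty set and finished cleanly. If rules were wanted, a single rules.d fragment is enough; the watch below is an illustrative example, not something present on this host:

    # /etc/audit/rules.d/10-identity.rules (hypothetical)
    -w /etc/passwd -p wa -k identity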
May 16 16:40:21.115406 sudo[1945]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 16 16:40:21.115563 sudo[1945]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:40:21.505574 systemd[1]: Starting docker.service - Docker Application Container Engine... May 16 16:40:21.515374 (dockerd)[1963]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 16 16:40:21.769419 dockerd[1963]: time="2025-05-16T16:40:21.769205063Z" level=info msg="Starting up" May 16 16:40:21.770461 dockerd[1963]: time="2025-05-16T16:40:21.770404849Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 16 16:40:21.813277 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2676480215-merged.mount: Deactivated successfully. May 16 16:40:21.864622 dockerd[1963]: time="2025-05-16T16:40:21.864593214Z" level=info msg="Loading containers: start." May 16 16:40:21.904064 kernel: Initializing XFRM netlink socket May 16 16:40:22.099628 systemd-timesyncd[1522]: Network configuration changed, trying to establish connection. May 16 16:40:22.157423 systemd-networkd[1541]: docker0: Link UP May 16 16:40:22.169772 dockerd[1963]: time="2025-05-16T16:40:22.169697627Z" level=info msg="Loading containers: done." May 16 16:40:22.183568 dockerd[1963]: time="2025-05-16T16:40:22.183531432Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 16 16:40:22.183669 dockerd[1963]: time="2025-05-16T16:40:22.183595253Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 16 16:40:22.183711 dockerd[1963]: time="2025-05-16T16:40:22.183669762Z" level=info msg="Initializing buildkit" May 16 16:40:22.196137 dockerd[1963]: time="2025-05-16T16:40:22.196106211Z" level=info msg="Completed buildkit initialization" May 16 16:40:22.202517 dockerd[1963]: time="2025-05-16T16:40:22.201550560Z" level=info msg="Daemon has completed initialization" May 16 16:40:22.202517 dockerd[1963]: time="2025-05-16T16:40:22.201589243Z" level=info msg="API listen on /run/docker.sock" May 16 16:40:22.202077 systemd[1]: Started docker.service - Docker Application Container Engine. May 16 16:41:43.498771 systemd-resolved[1492]: Clock change detected. Flushing caches. May 16 16:41:43.499504 systemd-timesyncd[1522]: Contacted time server 208.113.130.146:123 (2.flatcar.pool.ntp.org). May 16 16:41:43.499563 systemd-timesyncd[1522]: Initial clock synchronization to Fri 2025-05-16 16:41:43.498623 UTC. May 16 16:41:44.934090 containerd[1610]: time="2025-05-16T16:41:44.934055846Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\"" May 16 16:41:45.542394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount775722935.mount: Deactivated successfully. May 16 16:41:46.413952 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 16 16:41:46.415968 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:41:46.666392 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
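The overlay2 warning during "Loading containers" is informational: with CONFIG_OVERLAY_FS_REDIRECT_DIR built into the kernel, dockerd declines the native overlayfs diff path and falls back to the slower naive diff when building images. A sketch of how to confirm the kernel option, assuming the config is exposed at one of the usual paths:

    zgrep CONFIG_OVERLAY_FS_REDIRECT_DIR /proc/config.gz 2>/dev/null \
      || grep CONFIG_OVERLAY_FS_REDIRECT_DIR "/boot/config-$(uname -r)"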
May 16 16:41:46.673037 (kubelet)[2227]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:41:46.698092 kubelet[2227]: E0516 16:41:46.698045 2227 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:41:46.699579 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:41:46.699684 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:41:46.700037 systemd[1]: kubelet.service: Consumed 99ms CPU time, 110.2M memory peak. May 16 16:41:46.712826 containerd[1610]: time="2025-05-16T16:41:46.712784407Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:46.726285 containerd[1610]: time="2025-05-16T16:41:46.726247320Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.9: active requests=0, bytes read=28078845" May 16 16:41:46.736083 containerd[1610]: time="2025-05-16T16:41:46.736031297Z" level=info msg="ImageCreate event name:\"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:46.745743 containerd[1610]: time="2025-05-16T16:41:46.745688892Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:46.746695 containerd[1610]: time="2025-05-16T16:41:46.746654945Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.9\" with image id \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\", size \"28075645\" in 1.81256888s" May 16 16:41:46.746695 containerd[1610]: time="2025-05-16T16:41:46.746676720Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\" returns image reference \"sha256:0c19e0eafbdfffa1317cf99a16478265a4cd746ef677de27b0be6a8b515f36b1\"" May 16 16:41:46.747247 containerd[1610]: time="2025-05-16T16:41:46.747235977Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\"" May 16 16:41:48.479755 containerd[1610]: time="2025-05-16T16:41:48.479228676Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:48.495790 containerd[1610]: time="2025-05-16T16:41:48.495746326Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.9: active requests=0, bytes read=24713522" May 16 16:41:48.524827 containerd[1610]: time="2025-05-16T16:41:48.524785600Z" level=info msg="ImageCreate event name:\"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:48.542744 containerd[1610]: time="2025-05-16T16:41:48.542668439Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:48.543166 containerd[1610]: time="2025-05-16T16:41:48.543149282Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.9\" with image id \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\", size \"26315362\" in 1.795810605s" May 16 16:41:48.543229 containerd[1610]: time="2025-05-16T16:41:48.543219111Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\" returns image reference \"sha256:6aa3d581404ae6ae5dc355cb750aaedec843d2c99263d28fce50277e8e2a6ec2\"" May 16 16:41:48.543621 containerd[1610]: time="2025-05-16T16:41:48.543576895Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\"" May 16 16:41:49.862481 containerd[1610]: time="2025-05-16T16:41:49.861893157Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:49.864173 containerd[1610]: time="2025-05-16T16:41:49.864152453Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.9: active requests=0, bytes read=18784311" May 16 16:41:49.868758 containerd[1610]: time="2025-05-16T16:41:49.868738079Z" level=info msg="ImageCreate event name:\"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:49.873582 containerd[1610]: time="2025-05-16T16:41:49.873566904Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:49.874067 containerd[1610]: time="2025-05-16T16:41:49.874054840Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.9\" with image id \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\", size \"20386169\" in 1.33043408s" May 16 16:41:49.874115 containerd[1610]: time="2025-05-16T16:41:49.874108443Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\" returns image reference \"sha256:737ed3eafaf27a28ea9e13b736011bfed5bd349785ac6bc220b34eaf4adc51e3\"" May 16 16:41:49.874403 containerd[1610]: time="2025-05-16T16:41:49.874384457Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\"" May 16 16:41:51.175008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1550676684.mount: Deactivated successfully. 
May 16 16:41:51.595605 containerd[1610]: time="2025-05-16T16:41:51.595108205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:51.602314 containerd[1610]: time="2025-05-16T16:41:51.602288816Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.9: active requests=0, bytes read=30355623" May 16 16:41:51.612043 containerd[1610]: time="2025-05-16T16:41:51.611976324Z" level=info msg="ImageCreate event name:\"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:51.621984 containerd[1610]: time="2025-05-16T16:41:51.621937844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:51.622548 containerd[1610]: time="2025-05-16T16:41:51.622203769Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.9\" with image id \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\", repo tag \"registry.k8s.io/kube-proxy:v1.31.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\", size \"30354642\" in 1.747802838s" May 16 16:41:51.622548 containerd[1610]: time="2025-05-16T16:41:51.622225227Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\" returns image reference \"sha256:11a47a71ed3ecf643e15a11990daed3b656279449ba9344db0b54652c4723578\"" May 16 16:41:51.622548 containerd[1610]: time="2025-05-16T16:41:51.622527717Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 16 16:41:52.256607 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2147563775.mount: Deactivated successfully. 
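Each completed pull above pairs a "bytes read" figure with a wall-clock duration, so the effective registry throughput can be backed out directly; the kube-proxy pull just logged reports 30355623 bytes in 1.747802838s, roughly 17 MB/s. A quick check of that arithmetic, with the constants copied from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values logged for the kube-proxy:v1.31.9 pull above.
	bytesRead := 30355623.0
	elapsed := 1747802838 * time.Nanosecond // 1.747802838s

	mbPerSec := bytesRead / elapsed.Seconds() / 1e6
	fmt.Printf("effective pull rate: %.1f MB/s\n", mbPerSec) // ~17.4 MB/s
}
```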
May 16 16:41:53.029908 containerd[1610]: time="2025-05-16T16:41:53.029864154Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:53.040266 containerd[1610]: time="2025-05-16T16:41:53.040219963Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" May 16 16:41:53.045647 containerd[1610]: time="2025-05-16T16:41:53.045603573Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:53.060313 containerd[1610]: time="2025-05-16T16:41:53.060256215Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:53.060976 containerd[1610]: time="2025-05-16T16:41:53.060872033Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.438319502s" May 16 16:41:53.060976 containerd[1610]: time="2025-05-16T16:41:53.060890225Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 16 16:41:53.061296 containerd[1610]: time="2025-05-16T16:41:53.061271545Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 16 16:41:53.536341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2171626772.mount: Deactivated successfully. 
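The ImageCreate/PullImage pairs above come from containerd's CRI plugin fetching the control-plane images into the "k8s.io" namespace. A rough equivalent using the containerd Go client is sketched below; note this assumes the 1.x client import path github.com/containerd/containerd, while the daemon in this log is v2.0.4, where the client package moved under the v2 module:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the same daemon socket the shims above are served from.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "registry.k8s.io/pause:3.10", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "digest", img.Target().Digest)
}
```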
May 16 16:41:53.539600 containerd[1610]: time="2025-05-16T16:41:53.539568337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 16:41:53.540234 containerd[1610]: time="2025-05-16T16:41:53.540216087Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 16 16:41:53.541113 containerd[1610]: time="2025-05-16T16:41:53.541093263Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 16:41:53.542360 containerd[1610]: time="2025-05-16T16:41:53.542332388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 16 16:41:53.542719 containerd[1610]: time="2025-05-16T16:41:53.542640365Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 481.351972ms" May 16 16:41:53.542719 containerd[1610]: time="2025-05-16T16:41:53.542658737Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 16 16:41:53.542955 containerd[1610]: time="2025-05-16T16:41:53.542935820Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 16 16:41:54.064718 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2676468417.mount: Deactivated successfully. May 16 16:41:56.399100 update_engine[1592]: I20250516 16:41:56.399034 1592 update_attempter.cc:509] Updating boot flags... May 16 16:41:56.913949 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 16 16:41:56.915934 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:41:58.618313 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 16:41:58.620962 (kubelet)[2386]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:41:58.624531 containerd[1610]: time="2025-05-16T16:41:58.624489080Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.629746 containerd[1610]: time="2025-05-16T16:41:58.629705842Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013" May 16 16:41:58.650621 containerd[1610]: time="2025-05-16T16:41:58.650574626Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.660406 containerd[1610]: time="2025-05-16T16:41:58.660243306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.662339 containerd[1610]: time="2025-05-16T16:41:58.660660353Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 5.1177058s" May 16 16:41:58.662339 containerd[1610]: time="2025-05-16T16:41:58.660682036Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 16 16:41:58.836600 kubelet[2386]: E0516 16:41:58.836574 2386 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:41:58.837831 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:41:58.837913 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:41:58.838237 systemd[1]: kubelet.service: Consumed 121ms CPU time, 109M memory peak. May 16 16:42:00.918818 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:42:00.919176 systemd[1]: kubelet.service: Consumed 121ms CPU time, 109M memory peak. May 16 16:42:00.920846 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:42:00.940050 systemd[1]: Reload requested from client PID 2417 ('systemctl') (unit session-9.scope)... May 16 16:42:00.940060 systemd[1]: Reloading... May 16 16:42:01.019755 zram_generator::config[2470]: No configuration found. May 16 16:42:01.066084 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 16:42:01.074067 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 16 16:42:01.139944 systemd[1]: Reloading finished in 199 ms. 
May 16 16:42:01.165367 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 16 16:42:01.165421 systemd[1]: kubelet.service: Failed with result 'signal'. May 16 16:42:01.165602 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:42:01.166843 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:42:01.534628 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:42:01.540079 (kubelet)[2528]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 16 16:42:01.606910 kubelet[2528]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 16:42:01.606910 kubelet[2528]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 16 16:42:01.606910 kubelet[2528]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 16:42:01.607155 kubelet[2528]: I0516 16:42:01.606963 2528 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 16 16:42:01.863323 kubelet[2528]: I0516 16:42:01.862821 2528 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 16 16:42:01.863323 kubelet[2528]: I0516 16:42:01.862839 2528 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 16 16:42:01.863323 kubelet[2528]: I0516 16:42:01.863000 2528 server.go:934] "Client rotation is on, will bootstrap in background" May 16 16:42:01.892984 kubelet[2528]: E0516 16:42:01.892955 2528 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:01.893121 kubelet[2528]: I0516 16:42:01.893108 2528 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 16 16:42:01.902673 kubelet[2528]: I0516 16:42:01.902659 2528 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 16 16:42:01.907268 kubelet[2528]: I0516 16:42:01.907254 2528 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 16 16:42:01.909272 kubelet[2528]: I0516 16:42:01.909246 2528 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 16 16:42:01.909602 kubelet[2528]: I0516 16:42:01.909387 2528 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 16 16:42:01.909602 kubelet[2528]: I0516 16:42:01.909410 2528 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 16 16:42:01.909602 kubelet[2528]: I0516 16:42:01.909517 2528 topology_manager.go:138] "Creating topology manager with none policy" May 16 16:42:01.909602 kubelet[2528]: I0516 16:42:01.909523 2528 container_manager_linux.go:300] "Creating device plugin manager" May 16 16:42:01.910060 kubelet[2528]: I0516 16:42:01.910052 2528 state_mem.go:36] "Initialized new in-memory state store" May 16 16:42:01.915404 kubelet[2528]: I0516 16:42:01.915395 2528 kubelet.go:408] "Attempting to sync node with API server" May 16 16:42:01.915460 kubelet[2528]: I0516 16:42:01.915455 2528 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 16 16:42:01.918215 kubelet[2528]: I0516 16:42:01.918208 2528 kubelet.go:314] "Adding apiserver pod source" May 16 16:42:01.918267 kubelet[2528]: I0516 16:42:01.918262 2528 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 16 16:42:01.927137 kubelet[2528]: W0516 16:42:01.927102 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:01.927366 kubelet[2528]: I0516 16:42:01.927240 2528 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 16 16:42:01.927686 kubelet[2528]: W0516 16:42:01.927496 2528 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:01.927686 kubelet[2528]: E0516 16:42:01.927513 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:01.927686 kubelet[2528]: E0516 16:42:01.927530 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:01.929693 kubelet[2528]: I0516 16:42:01.929682 2528 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 16 16:42:01.929746 kubelet[2528]: W0516 16:42:01.929723 2528 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 16 16:42:01.930070 kubelet[2528]: I0516 16:42:01.930060 2528 server.go:1274] "Started kubelet" May 16 16:42:01.930947 kubelet[2528]: I0516 16:42:01.930870 2528 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 16 16:42:01.931660 kubelet[2528]: I0516 16:42:01.931651 2528 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 16 16:42:01.931961 kubelet[2528]: I0516 16:42:01.931937 2528 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 16 16:42:01.932087 kubelet[2528]: I0516 16:42:01.932077 2528 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 16 16:42:01.933266 kubelet[2528]: I0516 16:42:01.933253 2528 server.go:449] "Adding debug handlers to kubelet server" May 16 16:42:01.935662 kubelet[2528]: I0516 16:42:01.935652 2528 volume_manager.go:289] "Starting Kubelet Volume Manager" May 16 16:42:01.935896 kubelet[2528]: E0516 16:42:01.935886 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:01.938546 kubelet[2528]: I0516 16:42:01.938534 2528 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 16 16:42:01.938649 kubelet[2528]: I0516 16:42:01.938642 2528 reconciler.go:26] "Reconciler: start to sync state" May 16 16:42:01.941628 kubelet[2528]: I0516 16:42:01.941608 2528 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 16 16:42:01.945383 kubelet[2528]: E0516 16:42:01.941960 2528 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18400f859b48874c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-16 16:42:01.930041164 +0000 UTC m=+0.386619859,LastTimestamp:2025-05-16 16:42:01.930041164 +0000 UTC m=+0.386619859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 16 16:42:01.945827 kubelet[2528]: E0516 16:42:01.945809 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="200ms" May 16 16:42:01.945955 kubelet[2528]: W0516 16:42:01.945926 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:01.946227 kubelet[2528]: E0516 16:42:01.946196 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:01.946267 kubelet[2528]: I0516 16:42:01.946069 2528 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 16 16:42:01.947321 kubelet[2528]: I0516 16:42:01.947312 2528 factory.go:221] Registration of the systemd container factory successfully May 16 16:42:01.947842 kubelet[2528]: I0516 16:42:01.947832 2528 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 16 16:42:01.949706 kubelet[2528]: I0516 16:42:01.949538 2528 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 16 16:42:01.949706 kubelet[2528]: I0516 16:42:01.949551 2528 status_manager.go:217] "Starting to sync pod status with apiserver" May 16 16:42:01.949706 kubelet[2528]: I0516 16:42:01.949562 2528 kubelet.go:2321] "Starting kubelet main sync loop" May 16 16:42:01.949706 kubelet[2528]: E0516 16:42:01.949583 2528 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 16 16:42:01.952233 kubelet[2528]: I0516 16:42:01.952218 2528 factory.go:221] Registration of the containerd container factory successfully May 16 16:42:01.953539 kubelet[2528]: W0516 16:42:01.953514 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:01.953621 kubelet[2528]: E0516 16:42:01.953612 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:01.971560 kubelet[2528]: E0516 16:42:01.971544 2528 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 16 16:42:01.974378 kubelet[2528]: I0516 16:42:01.974355 2528 cpu_manager.go:214] "Starting CPU manager" policy="none" May 16 16:42:01.974378 kubelet[2528]: I0516 16:42:01.974367 2528 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 16 16:42:01.974451 kubelet[2528]: I0516 16:42:01.974399 2528 state_mem.go:36] "Initialized new in-memory state store" May 16 16:42:01.976166 kubelet[2528]: I0516 16:42:01.976152 2528 policy_none.go:49] "None policy: Start" May 16 16:42:01.976539 kubelet[2528]: I0516 16:42:01.976489 2528 memory_manager.go:170] "Starting memorymanager" policy="None" May 16 16:42:01.976591 kubelet[2528]: I0516 16:42:01.976551 2528 state_mem.go:35] "Initializing new in-memory state store" May 16 16:42:01.985823 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 16 16:42:01.995630 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 16 16:42:01.998013 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 16 16:42:02.018737 kubelet[2528]: I0516 16:42:02.018578 2528 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 16 16:42:02.018737 kubelet[2528]: I0516 16:42:02.018711 2528 eviction_manager.go:189] "Eviction manager: starting control loop" May 16 16:42:02.018858 kubelet[2528]: I0516 16:42:02.018717 2528 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 16 16:42:02.019103 kubelet[2528]: I0516 16:42:02.019094 2528 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 16 16:42:02.020368 kubelet[2528]: E0516 16:42:02.020354 2528 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 16 16:42:02.059937 systemd[1]: Created slice kubepods-burstable-poda3416600bab1918b24583836301c9096.slice - libcontainer container kubepods-burstable-poda3416600bab1918b24583836301c9096.slice. May 16 16:42:02.072239 systemd[1]: Created slice kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice - libcontainer container kubepods-burstable-podea5884ad3481d5218ff4c8f11f2934d5.slice. May 16 16:42:02.093217 systemd[1]: Created slice kubepods-burstable-podd3c12763cdeebb182eaa8c004984432e.slice - libcontainer container kubepods-burstable-podd3c12763cdeebb182eaa8c004984432e.slice. 
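The container_manager_linux.go line above dumps the kubelet's full NodeConfig as JSON, including the hard-eviction thresholds it will enforce: memory.available below 100Mi, nodefs.available below 10%, imagefs.available below 15%, and both inodesFree signals below 5%. A hedged sketch that decodes just that fragment of the logged blob; the struct here is illustrative, shaped after the logged field names rather than kubelet's own types:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// threshold mirrors the shape of the HardEvictionThresholds entries in the
// nodeConfig JSON logged by container_manager_linux.go above.
type threshold struct {
	Signal   string `json:"Signal"`
	Operator string `json:"Operator"`
	Value    struct {
		Quantity   *string `json:"Quantity"` // e.g. "100Mi", or null
		Percentage float64 `json:"Percentage"`
	} `json:"Value"`
}

func main() {
	// Excerpt copied from the log line above.
	blob := `{"HardEvictionThresholds":[
	 {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	 {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}},
	 {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15}}]}`

	var cfg struct {
		HardEvictionThresholds []threshold `json:"HardEvictionThresholds"`
	}
	if err := json.Unmarshal([]byte(blob), &cfg); err != nil {
		log.Fatal(err)
	}
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}
```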
May 16 16:42:02.120510 kubelet[2528]: I0516 16:42:02.120450 2528 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 16 16:42:02.121795 kubelet[2528]: E0516 16:42:02.121781 2528 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" May 16 16:42:02.146466 kubelet[2528]: E0516 16:42:02.146435 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="400ms" May 16 16:42:02.239283 kubelet[2528]: I0516 16:42:02.239247 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:02.239283 kubelet[2528]: I0516 16:42:02.239281 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:02.239411 kubelet[2528]: I0516 16:42:02.239296 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:02.239411 kubelet[2528]: I0516 16:42:02.239308 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:02.239411 kubelet[2528]: I0516 16:42:02.239317 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:02.239411 kubelet[2528]: I0516 16:42:02.239325 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:02.239411 kubelet[2528]: I0516 16:42:02.239338 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:02.239495 
kubelet[2528]: I0516 16:42:02.239348 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost" May 16 16:42:02.239495 kubelet[2528]: I0516 16:42:02.239357 2528 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:02.323278 kubelet[2528]: I0516 16:42:02.323255 2528 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 16 16:42:02.323504 kubelet[2528]: E0516 16:42:02.323489 2528 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" May 16 16:42:02.372480 containerd[1610]: time="2025-05-16T16:42:02.372403878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,}" May 16 16:42:02.397091 containerd[1610]: time="2025-05-16T16:42:02.396894713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,}" May 16 16:42:02.398999 containerd[1610]: time="2025-05-16T16:42:02.398986814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d3c12763cdeebb182eaa8c004984432e,Namespace:kube-system,Attempt:0,}" May 16 16:42:02.474214 containerd[1610]: time="2025-05-16T16:42:02.474184987Z" level=info msg="connecting to shim 11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1" address="unix:///run/containerd/s/248240060cba9c184af1fc60c4fe27b583b94f37f06000fd1497cf3226b77458" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:02.474354 containerd[1610]: time="2025-05-16T16:42:02.474190115Z" level=info msg="connecting to shim f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66" address="unix:///run/containerd/s/089c29949b855583b190a295bb086cdc3bd0fa0bd16ce2a2d52763b2f8fe0742" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:02.517646 containerd[1610]: time="2025-05-16T16:42:02.517610965Z" level=info msg="connecting to shim 739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb" address="unix:///run/containerd/s/94646cb38620b9ab0db5b410ad444ba6c6ab2145d2b99815892cf6f7856462de" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:02.547330 kubelet[2528]: E0516 16:42:02.547238 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="800ms" May 16 16:42:02.628852 systemd[1]: Started cri-containerd-739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb.scope - libcontainer container 739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb. 
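While the API server at 139.178.70.106:6443 is still refusing connections, the lease controller's retry interval doubles on each failure, visible above as interval="200ms", then "400ms", then "800ms". A minimal sketch of that backoff pattern; this is illustrative rather than kubelet's actual nodelease code, and the cap and attempt limit are arbitrary choices for the sketch:

```go
package main

import (
	"fmt"
	"time"
)

// retryWithBackoff illustrates the doubling retry interval visible in the
// "Failed to ensure lease exists, will retry" lines above (200ms -> 400ms
// -> 800ms).
func retryWithBackoff(op func() error) error {
	interval := 200 * time.Millisecond
	maxInterval := 5 * time.Second
	for attempt := 0; attempt < 10; attempt++ {
		if err := op(); err == nil {
			return nil
		}
		fmt.Printf("will retry, interval=%q\n", interval)
		time.Sleep(interval)
		interval *= 2
		if interval > maxInterval {
			interval = maxInterval
		}
	}
	return fmt.Errorf("gave up after repeated failures")
}

func main() {
	i := 0
	_ = retryWithBackoff(func() error {
		// Stand-in for the lease GET that the log shows being refused; fail
		// three times to reproduce the 200ms/400ms/800ms sequence.
		if i++; i <= 3 {
			return fmt.Errorf("connection refused")
		}
		return nil
	})
}
```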
May 16 16:42:02.632953 systemd[1]: Started cri-containerd-11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1.scope - libcontainer container 11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1. May 16 16:42:02.636858 systemd[1]: Started cri-containerd-f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66.scope - libcontainer container f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66. May 16 16:42:02.681025 containerd[1610]: time="2025-05-16T16:42:02.680999359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:ea5884ad3481d5218ff4c8f11f2934d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1\"" May 16 16:42:02.685032 containerd[1610]: time="2025-05-16T16:42:02.685009686Z" level=info msg="CreateContainer within sandbox \"11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 16 16:42:02.691356 containerd[1610]: time="2025-05-16T16:42:02.691334871Z" level=info msg="Container 566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:02.695592 containerd[1610]: time="2025-05-16T16:42:02.695562278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:a3416600bab1918b24583836301c9096,Namespace:kube-system,Attempt:0,} returns sandbox id \"f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66\"" May 16 16:42:02.696398 containerd[1610]: time="2025-05-16T16:42:02.696254425Z" level=info msg="CreateContainer within sandbox \"11e3dcb3cd2590520ab6ad5bac61493b8934e350429f5f295eb38366f95f5fb1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6\"" May 16 16:42:02.697210 containerd[1610]: time="2025-05-16T16:42:02.697146435Z" level=info msg="StartContainer for \"566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6\"" May 16 16:42:02.698974 containerd[1610]: time="2025-05-16T16:42:02.698961742Z" level=info msg="CreateContainer within sandbox \"f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 16 16:42:02.700348 containerd[1610]: time="2025-05-16T16:42:02.700335465Z" level=info msg="connecting to shim 566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6" address="unix:///run/containerd/s/248240060cba9c184af1fc60c4fe27b583b94f37f06000fd1497cf3226b77458" protocol=ttrpc version=3 May 16 16:42:02.701565 containerd[1610]: time="2025-05-16T16:42:02.701535788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:d3c12763cdeebb182eaa8c004984432e,Namespace:kube-system,Attempt:0,} returns sandbox id \"739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb\"" May 16 16:42:02.702903 containerd[1610]: time="2025-05-16T16:42:02.702885884Z" level=info msg="CreateContainer within sandbox \"739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 16 16:42:02.703749 containerd[1610]: time="2025-05-16T16:42:02.703713157Z" level=info msg="Container 8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:02.707044 containerd[1610]: time="2025-05-16T16:42:02.706966323Z" 
level=info msg="CreateContainer within sandbox \"f535773992a7be5c8be1f0d463f4a33077fd1db820b8d0a9ecadbfd5803bbb66\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c\"" May 16 16:42:02.707389 containerd[1610]: time="2025-05-16T16:42:02.707361818Z" level=info msg="StartContainer for \"8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c\"" May 16 16:42:02.708710 containerd[1610]: time="2025-05-16T16:42:02.708404772Z" level=info msg="Container f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:02.713848 containerd[1610]: time="2025-05-16T16:42:02.713825754Z" level=info msg="connecting to shim 8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c" address="unix:///run/containerd/s/089c29949b855583b190a295bb086cdc3bd0fa0bd16ce2a2d52763b2f8fe0742" protocol=ttrpc version=3 May 16 16:42:02.720296 containerd[1610]: time="2025-05-16T16:42:02.720263022Z" level=info msg="CreateContainer within sandbox \"739987d611106e4cd0a94567e1f8ea18274ebffc70d1c511fbc310552d4f42cb\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd\"" May 16 16:42:02.720619 containerd[1610]: time="2025-05-16T16:42:02.720606726Z" level=info msg="StartContainer for \"f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd\"" May 16 16:42:02.720849 systemd[1]: Started cri-containerd-566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6.scope - libcontainer container 566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6. May 16 16:42:02.722674 containerd[1610]: time="2025-05-16T16:42:02.722646073Z" level=info msg="connecting to shim f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd" address="unix:///run/containerd/s/94646cb38620b9ab0db5b410ad444ba6c6ab2145d2b99815892cf6f7856462de" protocol=ttrpc version=3 May 16 16:42:02.725245 kubelet[2528]: I0516 16:42:02.725228 2528 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 16 16:42:02.725663 kubelet[2528]: E0516 16:42:02.725475 2528 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" May 16 16:42:02.731860 systemd[1]: Started cri-containerd-8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c.scope - libcontainer container 8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c. May 16 16:42:02.745847 systemd[1]: Started cri-containerd-f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd.scope - libcontainer container f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd. 
May 16 16:42:02.793421 containerd[1610]: time="2025-05-16T16:42:02.792893692Z" level=info msg="StartContainer for \"f9dffcd0253a7f7d031dcea4d65b60d64dc80195c529c78b49969aa0e4cefcbd\" returns successfully" May 16 16:42:02.803041 containerd[1610]: time="2025-05-16T16:42:02.802961819Z" level=info msg="StartContainer for \"8f6b926729ce4f575e66e547c4c796b9b323cb5d26e1deb60076ed1d1b606c8c\" returns successfully" May 16 16:42:02.803714 containerd[1610]: time="2025-05-16T16:42:02.803694472Z" level=info msg="StartContainer for \"566d2039e97b92b4cda4d480c256ca9b43b69a6470a6ca3c74f682d4856703c6\" returns successfully" May 16 16:42:02.906496 kubelet[2528]: W0516 16:42:02.906043 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:02.906496 kubelet[2528]: E0516 16:42:02.906470 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:03.062321 kubelet[2528]: W0516 16:42:03.062248 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:03.062321 kubelet[2528]: E0516 16:42:03.062306 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:03.134273 kubelet[2528]: W0516 16:42:03.134209 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:03.134273 kubelet[2528]: E0516 16:42:03.134258 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:03.152203 kubelet[2528]: W0516 16:42:03.152165 2528 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused May 16 16:42:03.152298 kubelet[2528]: E0516 16:42:03.152209 2528 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" May 16 16:42:03.526839 kubelet[2528]: I0516 
16:42:03.526803 2528 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 16 16:42:03.881068 kubelet[2528]: E0516 16:42:03.880991 2528 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 16 16:42:03.959719 kubelet[2528]: I0516 16:42:03.959622 2528 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 16 16:42:03.959719 kubelet[2528]: E0516 16:42:03.959648 2528 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" May 16 16:42:03.971334 kubelet[2528]: E0516 16:42:03.971308 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.071978 kubelet[2528]: E0516 16:42:04.071957 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.172466 kubelet[2528]: E0516 16:42:04.172435 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.273064 kubelet[2528]: E0516 16:42:04.273031 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.373585 kubelet[2528]: E0516 16:42:04.373554 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.473800 kubelet[2528]: E0516 16:42:04.473671 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.574252 kubelet[2528]: E0516 16:42:04.574221 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.674825 kubelet[2528]: E0516 16:42:04.674796 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.775416 kubelet[2528]: E0516 16:42:04.775340 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.875896 kubelet[2528]: E0516 16:42:04.875864 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:04.976150 kubelet[2528]: E0516 16:42:04.976122 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:05.077069 kubelet[2528]: E0516 16:42:05.076986 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:05.177700 kubelet[2528]: E0516 16:42:05.177672 2528 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 16 16:42:05.924471 kubelet[2528]: I0516 16:42:05.924442 2528 apiserver.go:52] "Watching apiserver" May 16 16:42:05.939460 kubelet[2528]: I0516 16:42:05.939419 2528 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 16 16:42:06.125336 systemd[1]: Reload requested from client PID 2793 ('systemctl') (unit session-9.scope)... May 16 16:42:06.125511 systemd[1]: Reloading... May 16 16:42:06.201763 zram_generator::config[2840]: No configuration found. 
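The repeating "Error getting the current node from lister" entries above are the kubelet's status loop asking for the Node object before its registration has been observed; they stop shortly after "Successfully registered node". A client-go sketch of waiting for that object to become visible, assuming in-cluster credentials and using the node name "localhost" from the log:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// Poll until the registered Node object becomes visible, mirroring the
	// window in which the log above repeats "node \"localhost\" not found".
	ctx := context.Background()
	for {
		if _, err := clientset.CoreV1().Nodes().Get(ctx, "localhost", metav1.GetOptions{}); err == nil {
			fmt.Println("node registered")
			return
		}
		time.Sleep(100 * time.Millisecond)
	}
}
```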
May 16 16:42:06.273472 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 16 16:42:06.282462 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 16 16:42:06.358335 systemd[1]: Reloading finished in 232 ms. May 16 16:42:06.375039 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:42:06.406360 systemd[1]: kubelet.service: Deactivated successfully. May 16 16:42:06.406520 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:42:06.406558 systemd[1]: kubelet.service: Consumed 528ms CPU time, 128.3M memory peak. May 16 16:42:06.407938 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:42:07.350467 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:42:07.358978 (kubelet)[2904]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 16 16:42:07.517090 kubelet[2904]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 16:42:07.517090 kubelet[2904]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 16 16:42:07.517090 kubelet[2904]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 16 16:42:07.517090 kubelet[2904]: I0516 16:42:07.517067 2904 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 16 16:42:07.522317 kubelet[2904]: I0516 16:42:07.522293 2904 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 16 16:42:07.522317 kubelet[2904]: I0516 16:42:07.522311 2904 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 16 16:42:07.522498 kubelet[2904]: I0516 16:42:07.522485 2904 server.go:934] "Client rotation is on, will bootstrap in background" May 16 16:42:07.523294 kubelet[2904]: I0516 16:42:07.523282 2904 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 16 16:42:07.524568 kubelet[2904]: I0516 16:42:07.524494 2904 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 16 16:42:07.538408 kubelet[2904]: I0516 16:42:07.538396 2904 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 16 16:42:07.539971 kubelet[2904]: I0516 16:42:07.539917 2904 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 16 16:42:07.540024 kubelet[2904]: I0516 16:42:07.539981 2904 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 16 16:42:07.540048 kubelet[2904]: I0516 16:42:07.540035 2904 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 16 16:42:07.540148 kubelet[2904]: I0516 16:42:07.540051 2904 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 16 16:42:07.540202 kubelet[2904]: I0516 16:42:07.540150 2904 topology_manager.go:138] "Creating topology manager with none policy" May 16 16:42:07.540202 kubelet[2904]: I0516 16:42:07.540157 2904 container_manager_linux.go:300] "Creating device plugin manager" May 16 16:42:07.540202 kubelet[2904]: I0516 16:42:07.540173 2904 state_mem.go:36] "Initialized new in-memory state store" May 16 16:42:07.540692 kubelet[2904]: I0516 16:42:07.540277 2904 kubelet.go:408] "Attempting to sync node with API server" May 16 16:42:07.540692 kubelet[2904]: I0516 16:42:07.540284 2904 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 16 16:42:07.540692 kubelet[2904]: I0516 16:42:07.540300 2904 kubelet.go:314] "Adding apiserver pod source" May 16 16:42:07.540692 kubelet[2904]: I0516 16:42:07.540305 2904 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 16 16:42:07.545820 kubelet[2904]: I0516 16:42:07.545803 2904 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 16 16:42:07.546265 kubelet[2904]: I0516 16:42:07.546252 2904 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 16 16:42:07.558822 kubelet[2904]: I0516 16:42:07.557734 2904 server.go:1274] "Started kubelet" May 16 16:42:07.559900 kubelet[2904]: I0516 16:42:07.559819 2904 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 16 16:42:07.561721 kubelet[2904]: I0516 
16:42:07.561701 2904 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 16 16:42:07.561924 kubelet[2904]: I0516 16:42:07.561909 2904 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 16 16:42:07.562098 kubelet[2904]: I0516 16:42:07.562090 2904 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 16 16:42:07.562263 kubelet[2904]: I0516 16:42:07.562254 2904 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 16 16:42:07.562865 kubelet[2904]: I0516 16:42:07.562857 2904 server.go:449] "Adding debug handlers to kubelet server" May 16 16:42:07.564333 kubelet[2904]: I0516 16:42:07.564298 2904 volume_manager.go:289] "Starting Kubelet Volume Manager" May 16 16:42:07.564397 kubelet[2904]: I0516 16:42:07.564355 2904 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 16 16:42:07.564433 kubelet[2904]: I0516 16:42:07.564427 2904 reconciler.go:26] "Reconciler: start to sync state" May 16 16:42:07.566094 kubelet[2904]: I0516 16:42:07.566080 2904 factory.go:221] Registration of the systemd container factory successfully May 16 16:42:07.566175 kubelet[2904]: I0516 16:42:07.566156 2904 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 16 16:42:07.566715 kubelet[2904]: E0516 16:42:07.566637 2904 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 16 16:42:07.567448 kubelet[2904]: I0516 16:42:07.567434 2904 factory.go:221] Registration of the containerd container factory successfully May 16 16:42:07.575858 kubelet[2904]: I0516 16:42:07.575784 2904 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 16 16:42:07.576933 kubelet[2904]: I0516 16:42:07.576790 2904 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 16 16:42:07.576933 kubelet[2904]: I0516 16:42:07.576804 2904 status_manager.go:217] "Starting to sync pod status with apiserver" May 16 16:42:07.576933 kubelet[2904]: I0516 16:42:07.576816 2904 kubelet.go:2321] "Starting kubelet main sync loop" May 16 16:42:07.576933 kubelet[2904]: E0516 16:42:07.576838 2904 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 16 16:42:07.603920 kubelet[2904]: I0516 16:42:07.603862 2904 cpu_manager.go:214] "Starting CPU manager" policy="none" May 16 16:42:07.603920 kubelet[2904]: I0516 16:42:07.603879 2904 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 16 16:42:07.603920 kubelet[2904]: I0516 16:42:07.603892 2904 state_mem.go:36] "Initialized new in-memory state store" May 16 16:42:07.604161 kubelet[2904]: I0516 16:42:07.604140 2904 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 16 16:42:07.604161 kubelet[2904]: I0516 16:42:07.604149 2904 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 16 16:42:07.604161 kubelet[2904]: I0516 16:42:07.604161 2904 policy_none.go:49] "None policy: Start" May 16 16:42:07.604894 kubelet[2904]: I0516 16:42:07.604880 2904 memory_manager.go:170] "Starting memorymanager" policy="None" May 16 16:42:07.604894 kubelet[2904]: I0516 16:42:07.604894 2904 state_mem.go:35] "Initializing new in-memory state store" May 16 16:42:07.605032 kubelet[2904]: I0516 16:42:07.605021 2904 state_mem.go:75] "Updated machine memory state" May 16 16:42:07.607880 kubelet[2904]: I0516 16:42:07.607866 2904 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 16 16:42:07.607976 kubelet[2904]: I0516 16:42:07.607961 2904 eviction_manager.go:189] "Eviction manager: starting control loop" May 16 16:42:07.608007 kubelet[2904]: I0516 16:42:07.607970 2904 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 16 16:42:07.608091 kubelet[2904]: I0516 16:42:07.608081 2904 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 16 16:42:07.682797 kubelet[2904]: E0516 16:42:07.682771 2904 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" May 16 16:42:07.711571 kubelet[2904]: I0516 16:42:07.711549 2904 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 16 16:42:07.716431 kubelet[2904]: I0516 16:42:07.716414 2904 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 16 16:42:07.716554 kubelet[2904]: I0516 16:42:07.716548 2904 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 16 16:42:07.880399 kubelet[2904]: I0516 16:42:07.866591 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:07.880399 kubelet[2904]: I0516 16:42:07.866620 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ea5884ad3481d5218ff4c8f11f2934d5-kubeconfig\") pod 
\"kube-scheduler-localhost\" (UID: \"ea5884ad3481d5218ff4c8f11f2934d5\") " pod="kube-system/kube-scheduler-localhost" May 16 16:42:07.880399 kubelet[2904]: I0516 16:42:07.866632 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:07.880399 kubelet[2904]: I0516 16:42:07.866642 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:07.880399 kubelet[2904]: I0516 16:42:07.866651 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:07.880543 kubelet[2904]: I0516 16:42:07.866660 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:07.880543 kubelet[2904]: I0516 16:42:07.866669 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:07.880543 kubelet[2904]: I0516 16:42:07.866678 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d3c12763cdeebb182eaa8c004984432e-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"d3c12763cdeebb182eaa8c004984432e\") " pod="kube-system/kube-apiserver-localhost" May 16 16:42:07.880543 kubelet[2904]: I0516 16:42:07.866686 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3416600bab1918b24583836301c9096-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"a3416600bab1918b24583836301c9096\") " pod="kube-system/kube-controller-manager-localhost" May 16 16:42:08.555913 kubelet[2904]: I0516 16:42:08.555874 2904 apiserver.go:52] "Watching apiserver" May 16 16:42:08.565032 kubelet[2904]: I0516 16:42:08.564949 2904 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 16 16:42:08.597559 kubelet[2904]: E0516 16:42:08.597521 2904 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" May 16 16:42:08.611870 kubelet[2904]: I0516 16:42:08.611751 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" 
podStartSLOduration=1.6117380209999999 podStartE2EDuration="1.611738021s" podCreationTimestamp="2025-05-16 16:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:08.607192169 +0000 UTC m=+1.157443704" watchObservedRunningTime="2025-05-16 16:42:08.611738021 +0000 UTC m=+1.161989549" May 16 16:42:08.620070 kubelet[2904]: I0516 16:42:08.619788 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.6197747169999999 podStartE2EDuration="1.619774717s" podCreationTimestamp="2025-05-16 16:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:08.611976062 +0000 UTC m=+1.162227599" watchObservedRunningTime="2025-05-16 16:42:08.619774717 +0000 UTC m=+1.170026246" May 16 16:42:10.675153 kubelet[2904]: I0516 16:42:10.675130 2904 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 16 16:42:10.675831 containerd[1610]: time="2025-05-16T16:42:10.675473329Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 16 16:42:10.675968 kubelet[2904]: I0516 16:42:10.675570 2904 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 16 16:42:11.628602 kubelet[2904]: I0516 16:42:11.627959 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=6.627940188 podStartE2EDuration="6.627940188s" podCreationTimestamp="2025-05-16 16:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:08.619951001 +0000 UTC m=+1.170202537" watchObservedRunningTime="2025-05-16 16:42:11.627940188 +0000 UTC m=+4.178191714" May 16 16:42:11.638683 systemd[1]: Created slice kubepods-besteffort-pod7d0eeba3_3b42_4441_bfb2_81acbc94c50a.slice - libcontainer container kubepods-besteffort-pod7d0eeba3_3b42_4441_bfb2_81acbc94c50a.slice. 
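[Annotation] The `Created slice kubepods-besteffort-pod7d0eeba3_3b42_4441_bfb2_81acbc94c50a.slice` record above shows the naming the kubelet's systemd cgroup driver uses for per-pod slices: the pod UID with dashes mapped to underscores, prefixed by its QoS class. A minimal sketch of that mapping, grounded only in the unit names visible in this log (the kubelet's cgroup manager is the authoritative implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// besteffortSlice reproduces the slice naming seen in the log:
// dashes in the pod UID become underscores inside
// kubepods-besteffort-pod<UID>.slice.
func besteffortSlice(podUID string) string {
	return "kubepods-besteffort-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	fmt.Println(besteffortSlice("7d0eeba3-3b42-4441-bfb2-81acbc94c50a"))
	// kubepods-besteffort-pod7d0eeba3_3b42_4441_bfb2_81acbc94c50a.slice
}
```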
May 16 16:42:11.689934 kubelet[2904]: I0516 16:42:11.689872 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7d0eeba3-3b42-4441-bfb2-81acbc94c50a-kube-proxy\") pod \"kube-proxy-w9m79\" (UID: \"7d0eeba3-3b42-4441-bfb2-81acbc94c50a\") " pod="kube-system/kube-proxy-w9m79" May 16 16:42:11.690174 kubelet[2904]: I0516 16:42:11.689942 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7d0eeba3-3b42-4441-bfb2-81acbc94c50a-xtables-lock\") pod \"kube-proxy-w9m79\" (UID: \"7d0eeba3-3b42-4441-bfb2-81acbc94c50a\") " pod="kube-system/kube-proxy-w9m79" May 16 16:42:11.690174 kubelet[2904]: I0516 16:42:11.689954 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7d0eeba3-3b42-4441-bfb2-81acbc94c50a-lib-modules\") pod \"kube-proxy-w9m79\" (UID: \"7d0eeba3-3b42-4441-bfb2-81acbc94c50a\") " pod="kube-system/kube-proxy-w9m79" May 16 16:42:11.690174 kubelet[2904]: I0516 16:42:11.689965 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x64j\" (UniqueName: \"kubernetes.io/projected/7d0eeba3-3b42-4441-bfb2-81acbc94c50a-kube-api-access-5x64j\") pod \"kube-proxy-w9m79\" (UID: \"7d0eeba3-3b42-4441-bfb2-81acbc94c50a\") " pod="kube-system/kube-proxy-w9m79" May 16 16:42:11.745313 systemd[1]: Created slice kubepods-besteffort-pod866c0fc2_ca7b_4b86_9e56_962e06d2c949.slice - libcontainer container kubepods-besteffort-pod866c0fc2_ca7b_4b86_9e56_962e06d2c949.slice. May 16 16:42:11.790597 kubelet[2904]: I0516 16:42:11.790566 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt42f\" (UniqueName: \"kubernetes.io/projected/866c0fc2-ca7b-4b86-9e56-962e06d2c949-kube-api-access-lt42f\") pod \"tigera-operator-7c5755cdcb-x4htb\" (UID: \"866c0fc2-ca7b-4b86-9e56-962e06d2c949\") " pod="tigera-operator/tigera-operator-7c5755cdcb-x4htb" May 16 16:42:11.791353 kubelet[2904]: I0516 16:42:11.791020 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/866c0fc2-ca7b-4b86-9e56-962e06d2c949-var-lib-calico\") pod \"tigera-operator-7c5755cdcb-x4htb\" (UID: \"866c0fc2-ca7b-4b86-9e56-962e06d2c949\") " pod="tigera-operator/tigera-operator-7c5755cdcb-x4htb" May 16 16:42:11.945762 containerd[1610]: time="2025-05-16T16:42:11.945705798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-w9m79,Uid:7d0eeba3-3b42-4441-bfb2-81acbc94c50a,Namespace:kube-system,Attempt:0,}" May 16 16:42:11.959648 containerd[1610]: time="2025-05-16T16:42:11.958885743Z" level=info msg="connecting to shim 4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420" address="unix:///run/containerd/s/43150e62708b08262c27f72cc46a584aab4478e4a095064b8a08f3d4b5292ac3" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:11.981858 systemd[1]: Started cri-containerd-4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420.scope - libcontainer container 4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420. 
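[Annotation] The `RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-w9m79,...}` records that follow are containerd's CRI service handling kubelet requests over gRPC. A hedged sketch of querying the same CRI endpoint for its sandboxes, assuming the default containerd socket path `/run/containerd/containerd.sock`; illustrative only, not the kubelet's code:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Dial containerd's CRI socket (default path; adjust if configured differently).
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	client := runtimeapi.NewRuntimeServiceClient(conn)

	// List the pod sandboxes the kubelet asked the runtime to create,
	// e.g. kube-proxy-w9m79 in the records above.
	resp, err := client.ListPodSandbox(context.Background(), &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, sb := range resp.Items {
		fmt.Printf("%s\t%s/%s\t%s\n", sb.Id, sb.Metadata.Namespace, sb.Metadata.Name, sb.State)
	}
}
```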
May 16 16:42:11.998051 containerd[1610]: time="2025-05-16T16:42:11.998023611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-w9m79,Uid:7d0eeba3-3b42-4441-bfb2-81acbc94c50a,Namespace:kube-system,Attempt:0,} returns sandbox id \"4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420\"" May 16 16:42:12.004539 containerd[1610]: time="2025-05-16T16:42:12.003067751Z" level=info msg="CreateContainer within sandbox \"4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 16 16:42:12.010275 containerd[1610]: time="2025-05-16T16:42:12.010248814Z" level=info msg="Container 53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:12.014084 containerd[1610]: time="2025-05-16T16:42:12.014055549Z" level=info msg="CreateContainer within sandbox \"4c3d748066c38a13f5fee5296df30cb8c2593b37a5ac16cf6d715780567a6420\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d\"" May 16 16:42:12.014779 containerd[1610]: time="2025-05-16T16:42:12.014761693Z" level=info msg="StartContainer for \"53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d\"" May 16 16:42:12.015698 containerd[1610]: time="2025-05-16T16:42:12.015681349Z" level=info msg="connecting to shim 53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d" address="unix:///run/containerd/s/43150e62708b08262c27f72cc46a584aab4478e4a095064b8a08f3d4b5292ac3" protocol=ttrpc version=3 May 16 16:42:12.031362 systemd[1]: Started cri-containerd-53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d.scope - libcontainer container 53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d. May 16 16:42:12.047437 containerd[1610]: time="2025-05-16T16:42:12.047409704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-x4htb,Uid:866c0fc2-ca7b-4b86-9e56-962e06d2c949,Namespace:tigera-operator,Attempt:0,}" May 16 16:42:12.056344 containerd[1610]: time="2025-05-16T16:42:12.056308244Z" level=info msg="StartContainer for \"53b1e7b484fc968b99c9a4b053afd9bc6054134a7d93811ebb74f35f93fb672d\" returns successfully" May 16 16:42:12.060679 containerd[1610]: time="2025-05-16T16:42:12.060628994Z" level=info msg="connecting to shim 3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275" address="unix:///run/containerd/s/ec6d6f34785e664e189a3bb4aa2f24808d8332b02e41d4ef454e4368508c01d7" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:12.073938 systemd[1]: Started cri-containerd-3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275.scope - libcontainer container 3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275. May 16 16:42:12.110040 containerd[1610]: time="2025-05-16T16:42:12.110009999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-x4htb,Uid:866c0fc2-ca7b-4b86-9e56-962e06d2c949,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275\"" May 16 16:42:12.111864 containerd[1610]: time="2025-05-16T16:42:12.111833264Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 16 16:42:12.799318 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount914106772.mount: Deactivated successfully. 
May 16 16:42:13.487166 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2891698791.mount: Deactivated successfully. May 16 16:42:14.266186 containerd[1610]: time="2025-05-16T16:42:14.266098141Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:14.266924 containerd[1610]: time="2025-05-16T16:42:14.266896372Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 16 16:42:14.266995 containerd[1610]: time="2025-05-16T16:42:14.266981307Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:14.268886 containerd[1610]: time="2025-05-16T16:42:14.268852134Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:14.269576 containerd[1610]: time="2025-05-16T16:42:14.269552065Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.157694598s" May 16 16:42:14.269576 containerd[1610]: time="2025-05-16T16:42:14.269573156Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 16 16:42:14.272652 containerd[1610]: time="2025-05-16T16:42:14.272152854Z" level=info msg="CreateContainer within sandbox \"3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 16 16:42:14.279833 containerd[1610]: time="2025-05-16T16:42:14.279314703Z" level=info msg="Container a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:14.288749 containerd[1610]: time="2025-05-16T16:42:14.288688576Z" level=info msg="CreateContainer within sandbox \"3ec2c2bbee884a8b52cd2eb5be42d93f4fa1b3960c6b96beca8fef843de7e275\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c\"" May 16 16:42:14.289226 containerd[1610]: time="2025-05-16T16:42:14.289209453Z" level=info msg="StartContainer for \"a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c\"" May 16 16:42:14.290791 containerd[1610]: time="2025-05-16T16:42:14.290689891Z" level=info msg="connecting to shim a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c" address="unix:///run/containerd/s/ec6d6f34785e664e189a3bb4aa2f24808d8332b02e41d4ef454e4368508c01d7" protocol=ttrpc version=3 May 16 16:42:14.315909 systemd[1]: Started cri-containerd-a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c.scope - libcontainer container a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c. 
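[Annotation] The pull recorded below (`quay.io/tigera/operator:v1.38.0`, ~25 MB in ~2.16 s, resolved to the `@sha256:e0a34b26...` digest) goes through containerd's image service in the `k8s.io` namespace. A rough equivalent using containerd's Go client, assuming the default socket path; a sketch only, since the kubelet actually pulls via the CRI ImageService rather than this client:

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in the "k8s.io" namespace, matching the
	// namespace=k8s.io fields in the containerd records above.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.38.0", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(img.Name(), img.Target().Digest)
}
```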
May 16 16:42:14.338032 containerd[1610]: time="2025-05-16T16:42:14.338006502Z" level=info msg="StartContainer for \"a4f4db9bc58e69685a9e2decda1ec709b7439a8f7efe6f2190f433f08bbc3b9c\" returns successfully" May 16 16:42:14.610423 kubelet[2904]: I0516 16:42:14.610329 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-w9m79" podStartSLOduration=3.610313274 podStartE2EDuration="3.610313274s" podCreationTimestamp="2025-05-16 16:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:12.612691148 +0000 UTC m=+5.162942684" watchObservedRunningTime="2025-05-16 16:42:14.610313274 +0000 UTC m=+7.160564816" May 16 16:42:19.090015 kubelet[2904]: I0516 16:42:19.089928 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7c5755cdcb-x4htb" podStartSLOduration=5.9306074330000005 podStartE2EDuration="8.089917009s" podCreationTimestamp="2025-05-16 16:42:11 +0000 UTC" firstStartedPulling="2025-05-16 16:42:12.110776357 +0000 UTC m=+4.661027882" lastFinishedPulling="2025-05-16 16:42:14.270085932 +0000 UTC m=+6.820337458" observedRunningTime="2025-05-16 16:42:14.61122744 +0000 UTC m=+7.161478971" watchObservedRunningTime="2025-05-16 16:42:19.089917009 +0000 UTC m=+11.640168544" May 16 16:42:19.726150 sudo[1945]: pam_unix(sudo:session): session closed for user root May 16 16:42:19.727527 sshd[1944]: Connection closed by 147.75.109.163 port 54664 May 16 16:42:19.728270 sshd-session[1942]: pam_unix(sshd:session): session closed for user core May 16 16:42:19.731127 systemd-logind[1588]: Session 9 logged out. Waiting for processes to exit. May 16 16:42:19.731909 systemd[1]: sshd@6-139.178.70.106:22-147.75.109.163:54664.service: Deactivated successfully. May 16 16:42:19.734280 systemd[1]: session-9.scope: Deactivated successfully. May 16 16:42:19.735025 systemd[1]: session-9.scope: Consumed 3.237s CPU time, 151.3M memory peak. May 16 16:42:19.737632 systemd-logind[1588]: Removed session 9. May 16 16:42:22.597766 systemd[1]: Created slice kubepods-besteffort-pod2347aba5_a748_469f_8f07_8d49574b274e.slice - libcontainer container kubepods-besteffort-pod2347aba5_a748_469f_8f07_8d49574b274e.slice. 
May 16 16:42:22.662139 kubelet[2904]: I0516 16:42:22.662116 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2347aba5-a748-469f-8f07-8d49574b274e-typha-certs\") pod \"calico-typha-859ddc4fd9-n8z7f\" (UID: \"2347aba5-a748-469f-8f07-8d49574b274e\") " pod="calico-system/calico-typha-859ddc4fd9-n8z7f" May 16 16:42:22.662453 kubelet[2904]: I0516 16:42:22.662405 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfq9j\" (UniqueName: \"kubernetes.io/projected/2347aba5-a748-469f-8f07-8d49574b274e-kube-api-access-wfq9j\") pod \"calico-typha-859ddc4fd9-n8z7f\" (UID: \"2347aba5-a748-469f-8f07-8d49574b274e\") " pod="calico-system/calico-typha-859ddc4fd9-n8z7f" May 16 16:42:22.662453 kubelet[2904]: I0516 16:42:22.662429 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2347aba5-a748-469f-8f07-8d49574b274e-tigera-ca-bundle\") pod \"calico-typha-859ddc4fd9-n8z7f\" (UID: \"2347aba5-a748-469f-8f07-8d49574b274e\") " pod="calico-system/calico-typha-859ddc4fd9-n8z7f" May 16 16:42:22.901645 containerd[1610]: time="2025-05-16T16:42:22.901441220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-859ddc4fd9-n8z7f,Uid:2347aba5-a748-469f-8f07-8d49574b274e,Namespace:calico-system,Attempt:0,}" May 16 16:42:23.001992 containerd[1610]: time="2025-05-16T16:42:23.001964724Z" level=info msg="connecting to shim e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35" address="unix:///run/containerd/s/93f081e7242521a68cd5b57295af6a65a7bc6e16227b9e5a4752c3787da22c00" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:23.023839 systemd[1]: Started cri-containerd-e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35.scope - libcontainer container e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35. May 16 16:42:23.050473 systemd[1]: Created slice kubepods-besteffort-podff045956_f684_4691_9569_cf922b9398d9.slice - libcontainer container kubepods-besteffort-podff045956_f684_4691_9569_cf922b9398d9.slice. 
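[Annotation] A few records below, the kubelet begins a long run of repeated `driver-call.go` / `plugins.go` errors. They come from the FlexVolume plugin probe: each directory under `/opt/libexec/kubernetes/kubelet-plugins/volume/exec/` is expected to hold an executable that answers `init` with a JSON status, and the `nodeagent~uds` directory has no such binary, so the empty output fails to decode. The sketch below reproduces the exact error and shows the rough shape of a valid reply (field names per the FlexVolume convention; illustrative only):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus approximates the JSON a FlexVolume driver prints for `init`.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The missing nodeagent~uds binary produces empty output; decoding ""
	// reproduces the error the kubelet logs below.
	var s driverStatus
	fmt.Println(json.Unmarshal([]byte(""), &s)) // unexpected end of JSON input

	// A present driver would answer `init` with something like:
	reply, _ := json.Marshal(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
	fmt.Println(string(reply)) // {"status":"Success","capabilities":{"attach":false}}
}
```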
May 16 16:42:23.064048 kubelet[2904]: I0516 16:42:23.064019 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-cni-bin-dir\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064048 kubelet[2904]: I0516 16:42:23.064041 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff045956-f684-4691-9569-cf922b9398d9-tigera-ca-bundle\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064048 kubelet[2904]: I0516 16:42:23.064051 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-policysync\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064164 kubelet[2904]: I0516 16:42:23.064060 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-var-lib-calico\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064164 kubelet[2904]: I0516 16:42:23.064069 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ff045956-f684-4691-9569-cf922b9398d9-node-certs\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064164 kubelet[2904]: I0516 16:42:23.064080 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-cni-log-dir\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064164 kubelet[2904]: I0516 16:42:23.064087 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-lib-modules\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064164 kubelet[2904]: I0516 16:42:23.064098 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw46k\" (UniqueName: \"kubernetes.io/projected/ff045956-f684-4691-9569-cf922b9398d9-kube-api-access-tw46k\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064246 kubelet[2904]: I0516 16:42:23.064109 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-var-run-calico\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064246 kubelet[2904]: I0516 16:42:23.064119 2904 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-cni-net-dir\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064246 kubelet[2904]: I0516 16:42:23.064129 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-flexvol-driver-host\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.064246 kubelet[2904]: I0516 16:42:23.064138 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ff045956-f684-4691-9569-cf922b9398d9-xtables-lock\") pod \"calico-node-njv75\" (UID: \"ff045956-f684-4691-9569-cf922b9398d9\") " pod="calico-system/calico-node-njv75" May 16 16:42:23.079303 containerd[1610]: time="2025-05-16T16:42:23.079276510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-859ddc4fd9-n8z7f,Uid:2347aba5-a748-469f-8f07-8d49574b274e,Namespace:calico-system,Attempt:0,} returns sandbox id \"e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35\"" May 16 16:42:23.080357 containerd[1610]: time="2025-05-16T16:42:23.080306907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 16 16:42:23.165497 kubelet[2904]: E0516 16:42:23.165434 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.165497 kubelet[2904]: W0516 16:42:23.165447 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.165497 kubelet[2904]: E0516 16:42:23.165460 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.165710 kubelet[2904]: E0516 16:42:23.165649 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.165710 kubelet[2904]: W0516 16:42:23.165658 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.165710 kubelet[2904]: E0516 16:42:23.165666 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.165859 kubelet[2904]: E0516 16:42:23.165853 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.165892 kubelet[2904]: W0516 16:42:23.165887 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.165924 kubelet[2904]: E0516 16:42:23.165919 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.166079 kubelet[2904]: E0516 16:42:23.166033 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.166079 kubelet[2904]: W0516 16:42:23.166039 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.166079 kubelet[2904]: E0516 16:42:23.166044 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.166182 kubelet[2904]: E0516 16:42:23.166176 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.166219 kubelet[2904]: W0516 16:42:23.166213 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.168240 kubelet[2904]: E0516 16:42:23.166248 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.168429 kubelet[2904]: E0516 16:42:23.168373 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.168429 kubelet[2904]: W0516 16:42:23.168382 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.168429 kubelet[2904]: E0516 16:42:23.168390 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.168539 kubelet[2904]: E0516 16:42:23.168533 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.168579 kubelet[2904]: W0516 16:42:23.168573 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.168612 kubelet[2904]: E0516 16:42:23.168607 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.168798 kubelet[2904]: E0516 16:42:23.168749 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.168798 kubelet[2904]: W0516 16:42:23.168755 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.168798 kubelet[2904]: E0516 16:42:23.168761 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.168897 kubelet[2904]: E0516 16:42:23.168892 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.168931 kubelet[2904]: W0516 16:42:23.168926 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.168962 kubelet[2904]: E0516 16:42:23.168957 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.169089 kubelet[2904]: E0516 16:42:23.169045 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.169089 kubelet[2904]: W0516 16:42:23.169051 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.169089 kubelet[2904]: E0516 16:42:23.169056 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.169178 kubelet[2904]: E0516 16:42:23.169172 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.169309 kubelet[2904]: W0516 16:42:23.169208 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.169309 kubelet[2904]: E0516 16:42:23.169218 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.169791 kubelet[2904]: E0516 16:42:23.169784 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.170197 kubelet[2904]: W0516 16:42:23.169834 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.170197 kubelet[2904]: E0516 16:42:23.169845 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.170197 kubelet[2904]: E0516 16:42:23.169934 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.170197 kubelet[2904]: W0516 16:42:23.169942 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.170197 kubelet[2904]: E0516 16:42:23.169949 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.185109 kubelet[2904]: E0516 16:42:23.185092 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.185230 kubelet[2904]: W0516 16:42:23.185193 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.185230 kubelet[2904]: E0516 16:42:23.185208 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.347407 kubelet[2904]: E0516 16:42:23.347373 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:23.353256 containerd[1610]: time="2025-05-16T16:42:23.352910052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-njv75,Uid:ff045956-f684-4691-9569-cf922b9398d9,Namespace:calico-system,Attempt:0,}" May 16 16:42:23.356298 kubelet[2904]: E0516 16:42:23.356278 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356298 kubelet[2904]: W0516 16:42:23.356297 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356514 kubelet[2904]: E0516 16:42:23.356315 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.356514 kubelet[2904]: E0516 16:42:23.356456 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356514 kubelet[2904]: W0516 16:42:23.356463 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356514 kubelet[2904]: E0516 16:42:23.356476 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.356578 kubelet[2904]: E0516 16:42:23.356572 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356578 kubelet[2904]: W0516 16:42:23.356577 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356611 kubelet[2904]: E0516 16:42:23.356583 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.356821 kubelet[2904]: E0516 16:42:23.356656 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356821 kubelet[2904]: W0516 16:42:23.356662 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356821 kubelet[2904]: E0516 16:42:23.356667 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.356821 kubelet[2904]: E0516 16:42:23.356756 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356821 kubelet[2904]: W0516 16:42:23.356762 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356821 kubelet[2904]: E0516 16:42:23.356767 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.356998 kubelet[2904]: E0516 16:42:23.356850 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356998 kubelet[2904]: W0516 16:42:23.356856 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356998 kubelet[2904]: E0516 16:42:23.356863 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.356998 kubelet[2904]: E0516 16:42:23.356968 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.356998 kubelet[2904]: W0516 16:42:23.356974 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.356998 kubelet[2904]: E0516 16:42:23.356982 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357178 kubelet[2904]: E0516 16:42:23.357066 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357178 kubelet[2904]: W0516 16:42:23.357071 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357178 kubelet[2904]: E0516 16:42:23.357077 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.357178 kubelet[2904]: E0516 16:42:23.357167 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357178 kubelet[2904]: W0516 16:42:23.357171 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357178 kubelet[2904]: E0516 16:42:23.357176 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357356 kubelet[2904]: E0516 16:42:23.357258 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357356 kubelet[2904]: W0516 16:42:23.357273 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357356 kubelet[2904]: E0516 16:42:23.357280 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357479 kubelet[2904]: E0516 16:42:23.357367 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357479 kubelet[2904]: W0516 16:42:23.357372 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357479 kubelet[2904]: E0516 16:42:23.357377 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357479 kubelet[2904]: E0516 16:42:23.357462 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357479 kubelet[2904]: W0516 16:42:23.357468 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357479 kubelet[2904]: E0516 16:42:23.357473 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357606 kubelet[2904]: E0516 16:42:23.357587 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357606 kubelet[2904]: W0516 16:42:23.357593 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357606 kubelet[2904]: E0516 16:42:23.357600 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.357711 kubelet[2904]: E0516 16:42:23.357690 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357711 kubelet[2904]: W0516 16:42:23.357696 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357711 kubelet[2904]: E0516 16:42:23.357702 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357845 kubelet[2904]: E0516 16:42:23.357820 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357845 kubelet[2904]: W0516 16:42:23.357825 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357845 kubelet[2904]: E0516 16:42:23.357830 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.357996 kubelet[2904]: E0516 16:42:23.357907 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.357996 kubelet[2904]: W0516 16:42:23.357913 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.357996 kubelet[2904]: E0516 16:42:23.357920 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.358081 kubelet[2904]: E0516 16:42:23.358010 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.358081 kubelet[2904]: W0516 16:42:23.358016 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.358081 kubelet[2904]: E0516 16:42:23.358022 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358114 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.361881 kubelet[2904]: W0516 16:42:23.358131 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358138 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358238 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.361881 kubelet[2904]: W0516 16:42:23.358245 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358252 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358350 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.361881 kubelet[2904]: W0516 16:42:23.358375 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.361881 kubelet[2904]: E0516 16:42:23.358381 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.366447 kubelet[2904]: E0516 16:42:23.366425 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.366447 kubelet[2904]: W0516 16:42:23.366441 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.366680 kubelet[2904]: E0516 16:42:23.366455 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.366680 kubelet[2904]: I0516 16:42:23.366632 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mrw8\" (UniqueName: \"kubernetes.io/projected/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-kube-api-access-8mrw8\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr" May 16 16:42:23.366680 kubelet[2904]: E0516 16:42:23.366677 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.366877 kubelet[2904]: W0516 16:42:23.366682 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.366877 kubelet[2904]: E0516 16:42:23.366691 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.366877 kubelet[2904]: E0516 16:42:23.366801 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.366877 kubelet[2904]: W0516 16:42:23.366806 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.366877 kubelet[2904]: E0516 16:42:23.366812 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.367034 kubelet[2904]: E0516 16:42:23.367012 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.367034 kubelet[2904]: W0516 16:42:23.367020 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.367034 kubelet[2904]: E0516 16:42:23.367028 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.367244 kubelet[2904]: I0516 16:42:23.367069 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-kubelet-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr" May 16 16:42:23.368181 kubelet[2904]: E0516 16:42:23.368163 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.368226 kubelet[2904]: W0516 16:42:23.368220 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.368368 kubelet[2904]: E0516 16:42:23.368232 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.368368 kubelet[2904]: I0516 16:42:23.368243 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-registration-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr" May 16 16:42:23.368677 kubelet[2904]: E0516 16:42:23.368655 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.368819 kubelet[2904]: W0516 16:42:23.368722 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.369251 kubelet[2904]: E0516 16:42:23.369239 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.369425 kubelet[2904]: E0516 16:42:23.369418 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.369531 kubelet[2904]: W0516 16:42:23.369464 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.369531 kubelet[2904]: E0516 16:42:23.369472 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.369641 kubelet[2904]: E0516 16:42:23.369634 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.369746 kubelet[2904]: W0516 16:42:23.369678 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.369746 kubelet[2904]: E0516 16:42:23.369686 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.369746 kubelet[2904]: I0516 16:42:23.369699 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-varrun\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr" May 16 16:42:23.369967 kubelet[2904]: E0516 16:42:23.369940 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.369967 kubelet[2904]: W0516 16:42:23.369949 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.369967 kubelet[2904]: E0516 16:42:23.369958 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.371669 kubelet[2904]: E0516 16:42:23.370694 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.371669 kubelet[2904]: W0516 16:42:23.370703 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.371669 kubelet[2904]: E0516 16:42:23.370716 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.371669 kubelet[2904]: E0516 16:42:23.371565 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.371669 kubelet[2904]: W0516 16:42:23.371573 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.371669 kubelet[2904]: E0516 16:42:23.371646 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.372138 kubelet[2904]: I0516 16:42:23.371980 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-socket-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr" May 16 16:42:23.372138 kubelet[2904]: E0516 16:42:23.372102 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.372138 kubelet[2904]: W0516 16:42:23.372112 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.372315 kubelet[2904]: E0516 16:42:23.372142 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.373761 kubelet[2904]: E0516 16:42:23.372545 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.373761 kubelet[2904]: W0516 16:42:23.372554 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.373761 kubelet[2904]: E0516 16:42:23.372614 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.373985 kubelet[2904]: E0516 16:42:23.373972 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.374013 kubelet[2904]: W0516 16:42:23.373985 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.375378 kubelet[2904]: E0516 16:42:23.373999 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.375378 kubelet[2904]: E0516 16:42:23.375327 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.375378 kubelet[2904]: W0516 16:42:23.375338 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.375378 kubelet[2904]: E0516 16:42:23.375352 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.393757 containerd[1610]: time="2025-05-16T16:42:23.393398882Z" level=info msg="connecting to shim a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce" address="unix:///run/containerd/s/ce39f98048d7740492ab910116f08c53955463e9dcdc37b3477d6dc2ee7a8cff" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:23.412872 systemd[1]: Started cri-containerd-a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce.scope - libcontainer container a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce. May 16 16:42:23.446926 containerd[1610]: time="2025-05-16T16:42:23.446788448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-njv75,Uid:ff045956-f684-4691-9569-cf922b9398d9,Namespace:calico-system,Attempt:0,} returns sandbox id \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\"" May 16 16:42:23.472525 kubelet[2904]: E0516 16:42:23.472501 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.472525 kubelet[2904]: W0516 16:42:23.472518 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.472674 kubelet[2904]: E0516 16:42:23.472543 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:23.472674 kubelet[2904]: E0516 16:42:23.472644 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.472674 kubelet[2904]: W0516 16:42:23.472649 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.472674 kubelet[2904]: E0516 16:42:23.472663 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:23.366877 kubelet[2904]: E0516 16:42:23.366801 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.366877 kubelet[2904]: W0516 16:42:23.366806 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.366877 kubelet[2904]: E0516 16:42:23.366812 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the driver-call.go:262 / driver-call.go:149 / plugins.go:691 FlexVolume failure triple above repeats ~37 more times through 16:42:23.485, differing only in timestamps; duplicates elided, distinct records kept below ...]
May 16 16:42:23.367244 kubelet[2904]: I0516 16:42:23.367069 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-kubelet-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr"
May 16 16:42:23.368368 kubelet[2904]: I0516 16:42:23.368243 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-registration-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr"
May 16 16:42:23.369746 kubelet[2904]: I0516 16:42:23.369699 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-varrun\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr"
May 16 16:42:23.372138 kubelet[2904]: I0516 16:42:23.371980 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e2f3a294-0f1d-4058-aea4-a5b3b7a443f1-socket-dir\") pod \"csi-node-driver-czjpr\" (UID: \"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1\") " pod="calico-system/csi-node-driver-czjpr"
May 16 16:42:23.393757 containerd[1610]: time="2025-05-16T16:42:23.393398882Z" level=info msg="connecting to shim a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce" address="unix:///run/containerd/s/ce39f98048d7740492ab910116f08c53955463e9dcdc37b3477d6dc2ee7a8cff" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:23.412872 systemd[1]: Started cri-containerd-a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce.scope - libcontainer container a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce.
May 16 16:42:23.446926 containerd[1610]: time="2025-05-16T16:42:23.446788448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-njv75,Uid:ff045956-f684-4691-9569-cf922b9398d9,Namespace:calico-system,Attempt:0,} returns sandbox id \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\""
May 16 16:42:23.485058 kubelet[2904]: E0516 16:42:23.485039 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:23.485134 kubelet[2904]: W0516 16:42:23.485105 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:23.485134 kubelet[2904]: E0516 16:42:23.485121 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 16 16:42:24.722265 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2242351421.mount: Deactivated successfully.
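For context on the repeated kubelet errors above: the FlexVolume prober walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds the nodeagent~uds directory, and execs its uds driver with the init argument. The binary has not been installed yet, so the call produces no output, and decoding that empty output is what yields "unexpected end of JSON input". A minimal Go sketch of the failure mode (illustrative only, not kubelet's actual driver-call.go; the type and helper names are assumptions):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the JSON a FlexVolume driver is expected to print for
// "init", typically {"status":"Success","capabilities":{"attach":false}}.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// callDriver sketches the probe: exec the driver, then decode its stdout.
func callDriver(path string, args ...string) (*driverStatus, error) {
	out, err := exec.Command(path, args...).CombinedOutput()
	if err != nil {
		// Kubelet logs its exec wrapper's error ("executable file not found
		// in $PATH"); plain os/exec would report "no such file or directory".
		fmt.Printf("driver call failed: %v, output: %q\n", err, out)
	}
	var st driverStatus
	if jerr := json.Unmarshal(out, &st); jerr != nil {
		// With empty output this is exactly "unexpected end of JSON input".
		return nil, jerr
	}
	return &st, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	fmt.Println("init error:", err)
}

In a typical Calico rollout the flexvol-driver container (pulled and started further down in this log) is what populates that driver directory, after which the probe stops failing.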
May 16 16:42:25.546958 containerd[1610]: time="2025-05-16T16:42:25.546928201Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:25.547951 containerd[1610]: time="2025-05-16T16:42:25.547925787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 16 16:42:25.548204 containerd[1610]: time="2025-05-16T16:42:25.548182993Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:25.549739 containerd[1610]: time="2025-05-16T16:42:25.549541607Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:25.550177 containerd[1610]: time="2025-05-16T16:42:25.550163995Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.469840001s" May 16 16:42:25.550238 containerd[1610]: time="2025-05-16T16:42:25.550230219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 16 16:42:25.551721 containerd[1610]: time="2025-05-16T16:42:25.551678396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 16 16:42:25.559812 containerd[1610]: time="2025-05-16T16:42:25.559446895Z" level=info msg="CreateContainer within sandbox \"e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 16 16:42:25.565309 containerd[1610]: time="2025-05-16T16:42:25.565283013Z" level=info msg="Container b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:25.574060 containerd[1610]: time="2025-05-16T16:42:25.574029881Z" level=info msg="CreateContainer within sandbox \"e29da737d922772305eaaf97a4ce80ad8d6e69853f068ebbb3e8a1267a344e35\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e\"" May 16 16:42:25.575412 containerd[1610]: time="2025-05-16T16:42:25.575388994Z" level=info msg="StartContainer for \"b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e\"" May 16 16:42:25.576149 containerd[1610]: time="2025-05-16T16:42:25.576133299Z" level=info msg="connecting to shim b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e" address="unix:///run/containerd/s/93f081e7242521a68cd5b57295af6a65a7bc6e16227b9e5a4752c3787da22c00" protocol=ttrpc version=3 May 16 16:42:25.579569 kubelet[2904]: E0516 16:42:25.577208 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:25.600184 systemd[1]: Started 
cri-containerd-b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e.scope - libcontainer container b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e. May 16 16:42:25.663598 containerd[1610]: time="2025-05-16T16:42:25.663410788Z" level=info msg="StartContainer for \"b4bee78d7aa2a4048bb4133d3a75f62ff8c6a1781aab16eb3bbbb5af73a70a8e\" returns successfully" May 16 16:42:26.677348 kubelet[2904]: I0516 16:42:26.677288 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-859ddc4fd9-n8z7f" podStartSLOduration=2.206544907 podStartE2EDuration="4.677276202s" podCreationTimestamp="2025-05-16 16:42:22 +0000 UTC" firstStartedPulling="2025-05-16 16:42:23.080074489 +0000 UTC m=+15.630326014" lastFinishedPulling="2025-05-16 16:42:25.550805784 +0000 UTC m=+18.101057309" observedRunningTime="2025-05-16 16:42:26.676554074 +0000 UTC m=+19.226805609" watchObservedRunningTime="2025-05-16 16:42:26.677276202 +0000 UTC m=+19.227527737" May 16 16:42:26.679990 kubelet[2904]: E0516 16:42:26.679937 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:26.679990 kubelet[2904]: W0516 16:42:26.679950 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:26.679990 kubelet[2904]: E0516 16:42:26.679962 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:26.680232 kubelet[2904]: E0516 16:42:26.680189 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:26.680232 kubelet[2904]: W0516 16:42:26.680195 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:26.680232 kubelet[2904]: E0516 16:42:26.680200 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:26.680378 kubelet[2904]: E0516 16:42:26.680338 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:26.680378 kubelet[2904]: W0516 16:42:26.680344 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:26.680378 kubelet[2904]: E0516 16:42:26.680349 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" [... the same FlexVolume probe-failure triple repeats ~29 more times between 16:42:26.680 and 16:42:26.714, differing only in timestamps; duplicates elided ...] May 16 16:42:26.719630 kubelet[2904]: E0516 16:42:26.714621 2904 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:26.719800 kubelet[2904]: W0516 16:42:26.714627 2904 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:26.719800 kubelet[2904]: E0516 16:42:26.714633 2904 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:26.974828 containerd[1610]: time="2025-05-16T16:42:26.974504243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:26.975059 containerd[1610]: time="2025-05-16T16:42:26.974956221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 16 16:42:26.977192 containerd[1610]: time="2025-05-16T16:42:26.977172090Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:26.978795 containerd[1610]: time="2025-05-16T16:42:26.978771598Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:26.979240 containerd[1610]: time="2025-05-16T16:42:26.979219730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.427518712s" May 16 16:42:26.979283 containerd[1610]: time="2025-05-16T16:42:26.979243608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 16 16:42:26.982190 containerd[1610]: time="2025-05-16T16:42:26.982164072Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 16 16:42:27.157960 containerd[1610]: time="2025-05-16T16:42:27.157717055Z" level=info msg="Container 24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:27.193422 containerd[1610]: time="2025-05-16T16:42:27.193392056Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\"" May 16 16:42:27.193868 containerd[1610]: time="2025-05-16T16:42:27.193850870Z" level=info msg="StartContainer for \"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\"" May 16 16:42:27.195249 containerd[1610]: time="2025-05-16T16:42:27.195227332Z" level=info msg="connecting to shim 24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520" address="unix:///run/containerd/s/ce39f98048d7740492ab910116f08c53955463e9dcdc37b3477d6dc2ee7a8cff" protocol=ttrpc version=3 May 16 16:42:27.212897 systemd[1]: Started cri-containerd-24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520.scope - libcontainer container 24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520. 
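The pod_startup_latency_tracker entry at 16:42:26.677 above reports podStartSLOduration=2.206544907 alongside podStartE2EDuration="4.677276202s": the SLO figure is the end-to-end startup time minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick arithmetic check in Go (illustrative, not kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps exactly as printed in the pod_startup_latency_tracker entry.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	first, err := time.Parse(layout, "2025-05-16 16:42:23.080074489 +0000 UTC")
	if err != nil {
		panic(err)
	}
	last, err := time.Parse(layout, "2025-05-16 16:42:25.550805784 +0000 UTC")
	if err != nil {
		panic(err)
	}
	e2e := 4677276202 * time.Nanosecond // podStartE2EDuration="4.677276202s"

	pull := last.Sub(first) // image-pull window: 2.470731295s
	fmt.Println(e2e - pull) // prints 2.206544907s == podStartSLOduration
}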
May 16 16:42:27.242628 containerd[1610]: time="2025-05-16T16:42:27.242370965Z" level=info msg="StartContainer for \"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\" returns successfully" May 16 16:42:27.248987 systemd[1]: cri-containerd-24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520.scope: Deactivated successfully. May 16 16:42:27.262793 containerd[1610]: time="2025-05-16T16:42:27.262127005Z" level=info msg="TaskExit event in podsandbox handler container_id:\"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\" id:\"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\" pid:3583 exited_at:{seconds:1747413747 nanos:250494744}" May 16 16:42:27.268633 containerd[1610]: time="2025-05-16T16:42:27.268606117Z" level=info msg="received exit event container_id:\"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\" id:\"24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520\" pid:3583 exited_at:{seconds:1747413747 nanos:250494744}" May 16 16:42:27.286495 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-24b1051db2d5ecd11814f2480b6548c6d90eab36a7fbfc4e3847137af8362520-rootfs.mount: Deactivated successfully. May 16 16:42:27.578047 kubelet[2904]: E0516 16:42:27.577981 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:28.678433 containerd[1610]: time="2025-05-16T16:42:28.678406542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 16 16:42:29.578350 kubelet[2904]: E0516 16:42:29.577933 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:31.578570 kubelet[2904]: E0516 16:42:31.578163 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:32.397042 containerd[1610]: time="2025-05-16T16:42:32.397014536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:32.397876 containerd[1610]: time="2025-05-16T16:42:32.397859506Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 16 16:42:32.399388 containerd[1610]: time="2025-05-16T16:42:32.399357754Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:32.400483 containerd[1610]: time="2025-05-16T16:42:32.400452159Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:32.401028 containerd[1610]: time="2025-05-16T16:42:32.400929947Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 3.722499054s" May 16 16:42:32.401028 containerd[1610]: time="2025-05-16T16:42:32.400968210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 16 16:42:32.403362 containerd[1610]: time="2025-05-16T16:42:32.403322844Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 16 16:42:32.444535 containerd[1610]: time="2025-05-16T16:42:32.444490532Z" level=info msg="Container bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:32.449698 containerd[1610]: time="2025-05-16T16:42:32.449639743Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\"" May 16 16:42:32.450201 containerd[1610]: time="2025-05-16T16:42:32.450064218Z" level=info msg="StartContainer for \"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\"" May 16 16:42:32.451901 containerd[1610]: time="2025-05-16T16:42:32.451871575Z" level=info msg="connecting to shim bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51" address="unix:///run/containerd/s/ce39f98048d7740492ab910116f08c53955463e9dcdc37b3477d6dc2ee7a8cff" protocol=ttrpc version=3 May 16 16:42:32.470824 systemd[1]: Started cri-containerd-bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51.scope - libcontainer container bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51. May 16 16:42:32.513413 containerd[1610]: time="2025-05-16T16:42:32.513385542Z" level=info msg="StartContainer for \"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\" returns successfully" May 16 16:42:33.659844 kubelet[2904]: E0516 16:42:33.659795 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:34.051651 systemd[1]: cri-containerd-bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51.scope: Deactivated successfully. May 16 16:42:34.052100 systemd[1]: cri-containerd-bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51.scope: Consumed 310ms CPU time, 164.7M memory peak, 1004K read from disk, 170.9M written to disk. 
May 16 16:42:34.118753 kubelet[2904]: I0516 16:42:34.118490 2904 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 16 16:42:34.121662 containerd[1610]: time="2025-05-16T16:42:34.121596748Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\" id:\"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\" pid:3639 exited_at:{seconds:1747413754 nanos:121253787}" May 16 16:42:34.122041 containerd[1610]: time="2025-05-16T16:42:34.121653200Z" level=info msg="received exit event container_id:\"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\" id:\"bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51\" pid:3639 exited_at:{seconds:1747413754 nanos:121253787}" May 16 16:42:34.148837 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bb804dc639fe9140e929df23d74a5282d176c963a322078cb52cb85ff4e5dd51-rootfs.mount: Deactivated successfully. May 16 16:42:34.173871 systemd[1]: Created slice kubepods-burstable-podf5332b41_bb78_4eb1_8393_01391eebcbd1.slice - libcontainer container kubepods-burstable-podf5332b41_bb78_4eb1_8393_01391eebcbd1.slice. May 16 16:42:34.183400 systemd[1]: Created slice kubepods-burstable-podbcfb93e8_c4e5_4104_8e91_9ce9dd0e68cd.slice - libcontainer container kubepods-burstable-podbcfb93e8_c4e5_4104_8e91_9ce9dd0e68cd.slice. May 16 16:42:34.188622 systemd[1]: Created slice kubepods-besteffort-pod03784e1b_ebb0_4a4e_8922_7c6e649932aa.slice - libcontainer container kubepods-besteffort-pod03784e1b_ebb0_4a4e_8922_7c6e649932aa.slice. May 16 16:42:34.195191 systemd[1]: Created slice kubepods-besteffort-pod5816b0c4_629d_48da_91c2_b008ec0d38de.slice - libcontainer container kubepods-besteffort-pod5816b0c4_629d_48da_91c2_b008ec0d38de.slice. May 16 16:42:34.202419 systemd[1]: Created slice kubepods-besteffort-podc1703aa4_4ec2_4b1a_8bb3_ef63b2681873.slice - libcontainer container kubepods-besteffort-podc1703aa4_4ec2_4b1a_8bb3_ef63b2681873.slice. May 16 16:42:34.209205 systemd[1]: Created slice kubepods-besteffort-podf47c5b55_7da4_4cde_9213_a21e48a7736b.slice - libcontainer container kubepods-besteffort-podf47c5b55_7da4_4cde_9213_a21e48a7736b.slice. May 16 16:42:34.216335 systemd[1]: Created slice kubepods-besteffort-pod1378257a_134f_443c_9608_a7936ba8cf76.slice - libcontainer container kubepods-besteffort-pod1378257a_134f_443c_9608_a7936ba8cf76.slice. May 16 16:42:34.220841 systemd[1]: Created slice kubepods-besteffort-podba5965bb_e11f_490d_8dee_db0aaabf3b26.slice - libcontainer container kubepods-besteffort-podba5965bb_e11f_490d_8dee_db0aaabf3b26.slice. 
May 16 16:42:34.265842 kubelet[2904]: I0516 16:42:34.265585 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42cr\" (UniqueName: \"kubernetes.io/projected/03784e1b-ebb0-4a4e-8922-7c6e649932aa-kube-api-access-b42cr\") pod \"calico-apiserver-bd5dbfdd-mxd6d\" (UID: \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\") " pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" May 16 16:42:34.265842 kubelet[2904]: I0516 16:42:34.265629 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-266z9\" (UniqueName: \"kubernetes.io/projected/f5332b41-bb78-4eb1-8393-01391eebcbd1-kube-api-access-266z9\") pod \"coredns-7c65d6cfc9-5kps5\" (UID: \"f5332b41-bb78-4eb1-8393-01391eebcbd1\") " pod="kube-system/coredns-7c65d6cfc9-5kps5" May 16 16:42:34.265842 kubelet[2904]: I0516 16:42:34.265644 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5332b41-bb78-4eb1-8393-01391eebcbd1-config-volume\") pod \"coredns-7c65d6cfc9-5kps5\" (UID: \"f5332b41-bb78-4eb1-8393-01391eebcbd1\") " pod="kube-system/coredns-7c65d6cfc9-5kps5" May 16 16:42:34.265842 kubelet[2904]: I0516 16:42:34.265656 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfp9\" (UniqueName: \"kubernetes.io/projected/bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd-kube-api-access-4rfp9\") pod \"coredns-7c65d6cfc9-mckgq\" (UID: \"bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd\") " pod="kube-system/coredns-7c65d6cfc9-mckgq" May 16 16:42:34.265842 kubelet[2904]: I0516 16:42:34.265667 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03784e1b-ebb0-4a4e-8922-7c6e649932aa-calico-apiserver-certs\") pod \"calico-apiserver-bd5dbfdd-mxd6d\" (UID: \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\") " pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" May 16 16:42:34.266022 kubelet[2904]: I0516 16:42:34.265679 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd-config-volume\") pod \"coredns-7c65d6cfc9-mckgq\" (UID: \"bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd\") " pod="kube-system/coredns-7c65d6cfc9-mckgq" May 16 16:42:34.366776 kubelet[2904]: I0516 16:42:34.366301 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rgv\" (UniqueName: \"kubernetes.io/projected/ba5965bb-e11f-490d-8dee-db0aaabf3b26-kube-api-access-t9rgv\") pod \"calico-apiserver-567dc9f47-slfcz\" (UID: \"ba5965bb-e11f-490d-8dee-db0aaabf3b26\") " pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" May 16 16:42:34.366776 kubelet[2904]: I0516 16:42:34.366331 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbjz\" (UniqueName: \"kubernetes.io/projected/5816b0c4-629d-48da-91c2-b008ec0d38de-kube-api-access-9bbjz\") pod \"calico-kube-controllers-68574cbdb7-zks72\" (UID: \"5816b0c4-629d-48da-91c2-b008ec0d38de\") " pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" May 16 16:42:34.366776 kubelet[2904]: I0516 16:42:34.366345 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhtvc\" 
(UniqueName: \"kubernetes.io/projected/1378257a-134f-443c-9608-a7936ba8cf76-kube-api-access-jhtvc\") pod \"calico-apiserver-bd5dbfdd-7fnf8\" (UID: \"1378257a-134f-443c-9608-a7936ba8cf76\") " pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" May 16 16:42:34.366776 kubelet[2904]: I0516 16:42:34.366361 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba5965bb-e11f-490d-8dee-db0aaabf3b26-calico-apiserver-certs\") pod \"calico-apiserver-567dc9f47-slfcz\" (UID: \"ba5965bb-e11f-490d-8dee-db0aaabf3b26\") " pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" May 16 16:42:34.366776 kubelet[2904]: I0516 16:42:34.366373 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f47c5b55-7da4-4cde-9213-a21e48a7736b-goldmane-key-pair\") pod \"goldmane-8f77d7b6c-7x2n4\" (UID: \"f47c5b55-7da4-4cde-9213-a21e48a7736b\") " pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.373657 kubelet[2904]: I0516 16:42:34.366390 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-backend-key-pair\") pod \"whisker-7f4cbf4bcf-k2kzq\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " pod="calico-system/whisker-7f4cbf4bcf-k2kzq" May 16 16:42:34.373657 kubelet[2904]: I0516 16:42:34.366399 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-ca-bundle\") pod \"whisker-7f4cbf4bcf-k2kzq\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " pod="calico-system/whisker-7f4cbf4bcf-k2kzq" May 16 16:42:34.373657 kubelet[2904]: I0516 16:42:34.366415 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f47c5b55-7da4-4cde-9213-a21e48a7736b-config\") pod \"goldmane-8f77d7b6c-7x2n4\" (UID: \"f47c5b55-7da4-4cde-9213-a21e48a7736b\") " pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.373657 kubelet[2904]: I0516 16:42:34.366439 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5816b0c4-629d-48da-91c2-b008ec0d38de-tigera-ca-bundle\") pod \"calico-kube-controllers-68574cbdb7-zks72\" (UID: \"5816b0c4-629d-48da-91c2-b008ec0d38de\") " pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" May 16 16:42:34.373657 kubelet[2904]: I0516 16:42:34.366448 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f47c5b55-7da4-4cde-9213-a21e48a7736b-goldmane-ca-bundle\") pod \"goldmane-8f77d7b6c-7x2n4\" (UID: \"f47c5b55-7da4-4cde-9213-a21e48a7736b\") " pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.374046 kubelet[2904]: I0516 16:42:34.366458 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jks2l\" (UniqueName: \"kubernetes.io/projected/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-kube-api-access-jks2l\") pod \"whisker-7f4cbf4bcf-k2kzq\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " pod="calico-system/whisker-7f4cbf4bcf-k2kzq" 
May 16 16:42:34.374046 kubelet[2904]: I0516 16:42:34.366474 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46x4f\" (UniqueName: \"kubernetes.io/projected/f47c5b55-7da4-4cde-9213-a21e48a7736b-kube-api-access-46x4f\") pod \"goldmane-8f77d7b6c-7x2n4\" (UID: \"f47c5b55-7da4-4cde-9213-a21e48a7736b\") " pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.374046 kubelet[2904]: I0516 16:42:34.366485 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1378257a-134f-443c-9608-a7936ba8cf76-calico-apiserver-certs\") pod \"calico-apiserver-bd5dbfdd-7fnf8\" (UID: \"1378257a-134f-443c-9608-a7936ba8cf76\") " pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" May 16 16:42:34.489258 containerd[1610]: time="2025-05-16T16:42:34.489222190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mckgq,Uid:bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd,Namespace:kube-system,Attempt:0,}" May 16 16:42:34.492398 containerd[1610]: time="2025-05-16T16:42:34.492352390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5kps5,Uid:f5332b41-bb78-4eb1-8393-01391eebcbd1,Namespace:kube-system,Attempt:0,}" May 16 16:42:34.493931 containerd[1610]: time="2025-05-16T16:42:34.493898450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-mxd6d,Uid:03784e1b-ebb0-4a4e-8922-7c6e649932aa,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:34.500542 containerd[1610]: time="2025-05-16T16:42:34.500515385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68574cbdb7-zks72,Uid:5816b0c4-629d-48da-91c2-b008ec0d38de,Namespace:calico-system,Attempt:0,}" May 16 16:42:34.520672 containerd[1610]: time="2025-05-16T16:42:34.520622373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-7x2n4,Uid:f47c5b55-7da4-4cde-9213-a21e48a7736b,Namespace:calico-system,Attempt:0,}" May 16 16:42:34.523597 containerd[1610]: time="2025-05-16T16:42:34.523580126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f4cbf4bcf-k2kzq,Uid:c1703aa4-4ec2-4b1a-8bb3-ef63b2681873,Namespace:calico-system,Attempt:0,}" May 16 16:42:34.530114 containerd[1610]: time="2025-05-16T16:42:34.530092933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-slfcz,Uid:ba5965bb-e11f-490d-8dee-db0aaabf3b26,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:34.534005 containerd[1610]: time="2025-05-16T16:42:34.533983141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-7fnf8,Uid:1378257a-134f-443c-9608-a7936ba8cf76,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:34.738928 containerd[1610]: time="2025-05-16T16:42:34.738715580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 16 16:42:34.892426 containerd[1610]: time="2025-05-16T16:42:34.892345687Z" level=error msg="Failed to destroy network for sandbox \"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.892953 containerd[1610]: time="2025-05-16T16:42:34.892906228Z" level=error msg="Failed to destroy network for sandbox 
\"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.893035 containerd[1610]: time="2025-05-16T16:42:34.893016941Z" level=error msg="Failed to destroy network for sandbox \"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.899125 containerd[1610]: time="2025-05-16T16:42:34.893224293Z" level=error msg="Failed to destroy network for sandbox \"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.899125 containerd[1610]: time="2025-05-16T16:42:34.894410882Z" level=error msg="Failed to destroy network for sandbox \"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.899125 containerd[1610]: time="2025-05-16T16:42:34.895494801Z" level=error msg="Failed to destroy network for sandbox \"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.899125 containerd[1610]: time="2025-05-16T16:42:34.896073633Z" level=error msg="Failed to destroy network for sandbox \"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.899125 containerd[1610]: time="2025-05-16T16:42:34.897686995Z" level=error msg="Failed to destroy network for sandbox \"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.903481 containerd[1610]: time="2025-05-16T16:42:34.899617290Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-slfcz,Uid:ba5965bb-e11f-490d-8dee-db0aaabf3b26,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.904587 containerd[1610]: time="2025-05-16T16:42:34.904566016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5kps5,Uid:f5332b41-bb78-4eb1-8393-01391eebcbd1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.906990 kubelet[2904]: E0516 16:42:34.906958 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.907179 kubelet[2904]: E0516 16:42:34.907017 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5kps5" May 16 16:42:34.907179 kubelet[2904]: E0516 16:42:34.907032 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5kps5" May 16 16:42:34.907179 kubelet[2904]: E0516 16:42:34.907058 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-5kps5_kube-system(f5332b41-bb78-4eb1-8393-01391eebcbd1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-5kps5_kube-system(f5332b41-bb78-4eb1-8393-01391eebcbd1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c92dd3a8a991afecd7b6f818fc0752e19ea33803d09d2e9318d74a2422fa8a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-5kps5" podUID="f5332b41-bb78-4eb1-8393-01391eebcbd1" May 16 16:42:34.907250 kubelet[2904]: E0516 16:42:34.906957 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.907250 kubelet[2904]: E0516 16:42:34.907207 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" May 16 16:42:34.907250 kubelet[2904]: E0516 16:42:34.907229 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" May 16 16:42:34.907307 kubelet[2904]: E0516 16:42:34.907254 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-567dc9f47-slfcz_calico-apiserver(ba5965bb-e11f-490d-8dee-db0aaabf3b26)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-567dc9f47-slfcz_calico-apiserver(ba5965bb-e11f-490d-8dee-db0aaabf3b26)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93a51a5328dc5cab2d19c45d5b6eba066c27764f59e46c67f5e64128869d4bac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" podUID="ba5965bb-e11f-490d-8dee-db0aaabf3b26" May 16 16:42:34.909577 containerd[1610]: time="2025-05-16T16:42:34.909545497Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f4cbf4bcf-k2kzq,Uid:c1703aa4-4ec2-4b1a-8bb3-ef63b2681873,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.909992 kubelet[2904]: E0516 16:42:34.909966 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.910149 kubelet[2904]: E0516 16:42:34.910084 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f4cbf4bcf-k2kzq" May 16 16:42:34.910149 kubelet[2904]: E0516 16:42:34.910101 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f4cbf4bcf-k2kzq" May 16 16:42:34.910226 kubelet[2904]: E0516 16:42:34.910126 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f4cbf4bcf-k2kzq_calico-system(c1703aa4-4ec2-4b1a-8bb3-ef63b2681873)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-7f4cbf4bcf-k2kzq_calico-system(c1703aa4-4ec2-4b1a-8bb3-ef63b2681873)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c72bbc81feebcadfaf8af3f95da721207879d4fcf3215f39a51ce2f8bf7ec372\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f4cbf4bcf-k2kzq" podUID="c1703aa4-4ec2-4b1a-8bb3-ef63b2681873" May 16 16:42:34.914690 containerd[1610]: time="2025-05-16T16:42:34.914528212Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-7x2n4,Uid:f47c5b55-7da4-4cde-9213-a21e48a7736b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.914989 kubelet[2904]: E0516 16:42:34.914888 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.914989 kubelet[2904]: E0516 16:42:34.914918 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.914989 kubelet[2904]: E0516 16:42:34.914930 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-7x2n4" May 16 16:42:34.915071 kubelet[2904]: E0516 16:42:34.914954 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-8f77d7b6c-7x2n4_calico-system(f47c5b55-7da4-4cde-9213-a21e48a7736b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-8f77d7b6c-7x2n4_calico-system(f47c5b55-7da4-4cde-9213-a21e48a7736b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4988ba9a5993f13786b0d5d20b6f371be634903d74857335c27ce69065c3d10e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:42:34.917240 containerd[1610]: time="2025-05-16T16:42:34.917058642Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-7fnf8,Uid:1378257a-134f-443c-9608-a7936ba8cf76,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.917448 kubelet[2904]: E0516 16:42:34.917424 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.917484 kubelet[2904]: E0516 16:42:34.917456 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" May 16 16:42:34.917484 kubelet[2904]: E0516 16:42:34.917473 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" May 16 16:42:34.917531 kubelet[2904]: E0516 16:42:34.917497 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bd5dbfdd-7fnf8_calico-apiserver(1378257a-134f-443c-9608-a7936ba8cf76)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bd5dbfdd-7fnf8_calico-apiserver(1378257a-134f-443c-9608-a7936ba8cf76)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87af3c8ac74b962b15601e21c2d65305bce69c3ece308bd4650987f731a252fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" podUID="1378257a-134f-443c-9608-a7936ba8cf76" May 16 16:42:34.921861 containerd[1610]: time="2025-05-16T16:42:34.921660372Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68574cbdb7-zks72,Uid:5816b0c4-629d-48da-91c2-b008ec0d38de,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.922107 kubelet[2904]: E0516 16:42:34.922045 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.922107 kubelet[2904]: E0516 16:42:34.922077 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" May 16 16:42:34.922107 kubelet[2904]: E0516 16:42:34.922091 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" May 16 16:42:34.922184 kubelet[2904]: E0516 16:42:34.922126 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68574cbdb7-zks72_calico-system(5816b0c4-629d-48da-91c2-b008ec0d38de)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68574cbdb7-zks72_calico-system(5816b0c4-629d-48da-91c2-b008ec0d38de)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4110a5ac6c1d2a5efe85b487ad24576c8aca8f04e3d2b521950ee8110f01fcd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" podUID="5816b0c4-629d-48da-91c2-b008ec0d38de" May 16 16:42:34.923524 kubelet[2904]: E0516 16:42:34.922598 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.923524 kubelet[2904]: E0516 16:42:34.922613 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" May 16 16:42:34.923524 kubelet[2904]: E0516 16:42:34.922623 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" May 16 16:42:34.923590 containerd[1610]: time="2025-05-16T16:42:34.922537095Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-mxd6d,Uid:03784e1b-ebb0-4a4e-8922-7c6e649932aa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.923590 containerd[1610]: time="2025-05-16T16:42:34.922799753Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mckgq,Uid:bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.923659 kubelet[2904]: E0516 16:42:34.922638 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-bd5dbfdd-mxd6d_calico-apiserver(03784e1b-ebb0-4a4e-8922-7c6e649932aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-bd5dbfdd-mxd6d_calico-apiserver(03784e1b-ebb0-4a4e-8922-7c6e649932aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72d1508e12280a32d402247711f8251f750037d520f628512cc86d4e93f471fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" podUID="03784e1b-ebb0-4a4e-8922-7c6e649932aa" May 16 16:42:34.923659 kubelet[2904]: E0516 16:42:34.922872 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:34.923659 kubelet[2904]: E0516 16:42:34.923197 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mckgq" May 16 16:42:34.923745 kubelet[2904]: E0516 16:42:34.923209 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mckgq" May 16 16:42:34.923745 kubelet[2904]: E0516 16:42:34.923230 2904 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-mckgq_kube-system(bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-mckgq_kube-system(bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a784148bf25a0259fc8339aef63a2727a005a381c43364cdc73a05905a5a3b99\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-mckgq" podUID="bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd" May 16 16:42:35.582421 systemd[1]: Created slice kubepods-besteffort-pode2f3a294_0f1d_4058_aea4_a5b3b7a443f1.slice - libcontainer container kubepods-besteffort-pode2f3a294_0f1d_4058_aea4_a5b3b7a443f1.slice. May 16 16:42:35.584002 containerd[1610]: time="2025-05-16T16:42:35.583976735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czjpr,Uid:e2f3a294-0f1d-4058-aea4-a5b3b7a443f1,Namespace:calico-system,Attempt:0,}" May 16 16:42:35.619106 containerd[1610]: time="2025-05-16T16:42:35.619066348Z" level=error msg="Failed to destroy network for sandbox \"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:35.620414 systemd[1]: run-netns-cni\x2da0065068\x2de8ea\x2d5fd4\x2dfe6b\x2d5054c7df7114.mount: Deactivated successfully. May 16 16:42:35.621072 containerd[1610]: time="2025-05-16T16:42:35.621045236Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czjpr,Uid:e2f3a294-0f1d-4058-aea4-a5b3b7a443f1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:35.621256 kubelet[2904]: E0516 16:42:35.621200 2904 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:35.621256 kubelet[2904]: E0516 16:42:35.621248 2904 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-czjpr" May 16 16:42:35.621321 kubelet[2904]: E0516 16:42:35.621265 2904 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-czjpr" May 16 16:42:35.621321 kubelet[2904]: E0516 16:42:35.621297 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-czjpr_calico-system(e2f3a294-0f1d-4058-aea4-a5b3b7a443f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-czjpr_calico-system(e2f3a294-0f1d-4058-aea4-a5b3b7a443f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf81f020dd5e6ef4671ccb7591c2fdbaac12445901f2b71278a092f097daa7e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-czjpr" podUID="e2f3a294-0f1d-4058-aea4-a5b3b7a443f1" May 16 16:42:39.807909 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3157776792.mount: Deactivated successfully. May 16 16:42:39.972415 containerd[1610]: time="2025-05-16T16:42:39.956988088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:40.039955 containerd[1610]: time="2025-05-16T16:42:40.039919548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 16 16:42:40.070393 containerd[1610]: time="2025-05-16T16:42:40.070305552Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:40.082477 containerd[1610]: time="2025-05-16T16:42:40.082425859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:40.084307 containerd[1610]: time="2025-05-16T16:42:40.084191806Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 5.343208601s" May 16 16:42:40.084307 containerd[1610]: time="2025-05-16T16:42:40.084222815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 16 16:42:40.164073 containerd[1610]: time="2025-05-16T16:42:40.164039275Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 16 16:42:40.755237 containerd[1610]: time="2025-05-16T16:42:40.754897319Z" level=info msg="Container 86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:40.755766 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1497990937.mount: Deactivated successfully. 
May 16 16:42:40.970802 containerd[1610]: time="2025-05-16T16:42:40.970773507Z" level=info msg="CreateContainer within sandbox \"a86c156c375ba422189c1bcf3e9ad25de61273144e7c112ab08f2f1be3abcdce\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\"" May 16 16:42:40.971318 containerd[1610]: time="2025-05-16T16:42:40.971187127Z" level=info msg="StartContainer for \"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\"" May 16 16:42:40.982268 containerd[1610]: time="2025-05-16T16:42:40.982222354Z" level=info msg="connecting to shim 86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853" address="unix:///run/containerd/s/ce39f98048d7740492ab910116f08c53955463e9dcdc37b3477d6dc2ee7a8cff" protocol=ttrpc version=3 May 16 16:42:41.143865 systemd[1]: Started cri-containerd-86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853.scope - libcontainer container 86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853. May 16 16:42:41.198997 containerd[1610]: time="2025-05-16T16:42:41.198973990Z" level=info msg="StartContainer for \"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" returns successfully" May 16 16:42:41.456128 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 16 16:42:41.457959 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. May 16 16:42:41.816789 kubelet[2904]: I0516 16:42:41.816523 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-njv75" podStartSLOduration=2.180053408 podStartE2EDuration="18.816511384s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="2025-05-16 16:42:23.450287113 +0000 UTC m=+16.000538639" lastFinishedPulling="2025-05-16 16:42:40.086745086 +0000 UTC m=+32.636996615" observedRunningTime="2025-05-16 16:42:41.810055335 +0000 UTC m=+34.360306870" watchObservedRunningTime="2025-05-16 16:42:41.816511384 +0000 UTC m=+34.366762914" May 16 16:42:41.998393 containerd[1610]: time="2025-05-16T16:42:41.998342905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" id:\"f2a20c81d1ae88c48650198bc57e460c4c905fa39475181049a6a5eb4e7fb68a\" pid:3991 exit_status:1 exited_at:{seconds:1747413761 nanos:993113667}" May 16 16:42:42.108921 kubelet[2904]: I0516 16:42:42.108517 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-backend-key-pair\") pod \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " May 16 16:42:42.110622 kubelet[2904]: I0516 16:42:42.109070 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-ca-bundle\") pod \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " May 16 16:42:42.110622 kubelet[2904]: I0516 16:42:42.109089 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jks2l\" (UniqueName: \"kubernetes.io/projected/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-kube-api-access-jks2l\") pod \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\" (UID: \"c1703aa4-4ec2-4b1a-8bb3-ef63b2681873\") " May 16 16:42:42.111292
kubelet[2904]: I0516 16:42:42.111268 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873" (UID: "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 16 16:42:42.127673 systemd[1]: var-lib-kubelet-pods-c1703aa4\x2d4ec2\x2d4b1a\x2d8bb3\x2def63b2681873-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djks2l.mount: Deactivated successfully. May 16 16:42:42.128944 kubelet[2904]: I0516 16:42:42.128921 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-kube-api-access-jks2l" (OuterVolumeSpecName: "kube-api-access-jks2l") pod "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873" (UID: "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873"). InnerVolumeSpecName "kube-api-access-jks2l". PluginName "kubernetes.io/projected", VolumeGidValue "" May 16 16:42:42.129772 kubelet[2904]: I0516 16:42:42.129756 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873" (UID: "c1703aa4-4ec2-4b1a-8bb3-ef63b2681873"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" May 16 16:42:42.130577 systemd[1]: var-lib-kubelet-pods-c1703aa4\x2d4ec2\x2d4b1a\x2d8bb3\x2def63b2681873-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 16 16:42:42.209604 kubelet[2904]: I0516 16:42:42.209577 2904 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 16 16:42:42.209604 kubelet[2904]: I0516 16:42:42.209607 2904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jks2l\" (UniqueName: \"kubernetes.io/projected/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-kube-api-access-jks2l\") on node \"localhost\" DevicePath \"\"" May 16 16:42:42.209723 kubelet[2904]: I0516 16:42:42.209614 2904 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" May 16 16:42:42.768084 systemd[1]: Removed slice kubepods-besteffort-podc1703aa4_4ec2_4b1a_8bb3_ef63b2681873.slice - libcontainer container kubepods-besteffort-podc1703aa4_4ec2_4b1a_8bb3_ef63b2681873.slice. May 16 16:42:42.849072 systemd[1]: Created slice kubepods-besteffort-pod3f14b109_fa73_4313_9c07_3d3c314a58ba.slice - libcontainer container kubepods-besteffort-pod3f14b109_fa73_4313_9c07_3d3c314a58ba.slice. 
May 16 16:42:42.868480 containerd[1610]: time="2025-05-16T16:42:42.868449724Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" id:\"67d5044186cb2b2f54c611394678a7ef618df9013723fae65dc986795d544f85\" pid:4031 exit_status:1 exited_at:{seconds:1747413762 nanos:868201476}" May 16 16:42:43.015625 kubelet[2904]: I0516 16:42:43.015604 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ddw\" (UniqueName: \"kubernetes.io/projected/3f14b109-fa73-4313-9c07-3d3c314a58ba-kube-api-access-82ddw\") pod \"whisker-54f476585f-nqgnj\" (UID: \"3f14b109-fa73-4313-9c07-3d3c314a58ba\") " pod="calico-system/whisker-54f476585f-nqgnj" May 16 16:42:43.017206 kubelet[2904]: I0516 16:42:43.017188 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3f14b109-fa73-4313-9c07-3d3c314a58ba-whisker-backend-key-pair\") pod \"whisker-54f476585f-nqgnj\" (UID: \"3f14b109-fa73-4313-9c07-3d3c314a58ba\") " pod="calico-system/whisker-54f476585f-nqgnj" May 16 16:42:43.017253 kubelet[2904]: I0516 16:42:43.017213 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f14b109-fa73-4313-9c07-3d3c314a58ba-whisker-ca-bundle\") pod \"whisker-54f476585f-nqgnj\" (UID: \"3f14b109-fa73-4313-9c07-3d3c314a58ba\") " pod="calico-system/whisker-54f476585f-nqgnj" May 16 16:42:43.155934 containerd[1610]: time="2025-05-16T16:42:43.155868402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54f476585f-nqgnj,Uid:3f14b109-fa73-4313-9c07-3d3c314a58ba,Namespace:calico-system,Attempt:0,}" May 16 16:42:43.452571 systemd-networkd[1541]: vxlan.calico: Link UP May 16 16:42:43.452575 systemd-networkd[1541]: vxlan.calico: Gained carrier May 16 16:42:43.614796 systemd-networkd[1541]: cali028135166fd: Link UP May 16 16:42:43.615940 systemd-networkd[1541]: cali028135166fd: Gained carrier May 16 16:42:43.631408 containerd[1610]: 2025-05-16 16:42:43.201 [INFO][4133] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 16:42:43.631408 containerd[1610]: 2025-05-16 16:42:43.257 [INFO][4133] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--54f476585f--nqgnj-eth0 whisker-54f476585f- calico-system 3f14b109-fa73-4313-9c07-3d3c314a58ba 918 0 2025-05-16 16:42:42 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:54f476585f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-54f476585f-nqgnj eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali028135166fd [] [] }} ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-" May 16 16:42:43.631408 containerd[1610]: 2025-05-16 16:42:43.258 [INFO][4133] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.631408 containerd[1610]: 2025-05-16 
16:42:43.543 [INFO][4159] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" HandleID="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Workload="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.548 [INFO][4159] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" HandleID="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Workload="localhost-k8s-whisker--54f476585f--nqgnj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000102580), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-54f476585f-nqgnj", "timestamp":"2025-05-16 16:42:43.543387058 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.548 [INFO][4159] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.548 [INFO][4159] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.549 [INFO][4159] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.569 [INFO][4159] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" host="localhost" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.583 [INFO][4159] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.586 [INFO][4159] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.588 [INFO][4159] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.590 [INFO][4159] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:43.631575 containerd[1610]: 2025-05-16 16:42:43.590 [INFO][4159] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" host="localhost" May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.592 [INFO][4159] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.596 [INFO][4159] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" host="localhost" May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.602 [INFO][4159] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" host="localhost" May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.602 [INFO][4159] ipam/ipam.go 
878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" host="localhost" May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.602 [INFO][4159] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:43.633242 containerd[1610]: 2025-05-16 16:42:43.602 [INFO][4159] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" HandleID="k8s-pod-network.ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Workload="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.634846 containerd[1610]: 2025-05-16 16:42:43.605 [INFO][4133] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--54f476585f--nqgnj-eth0", GenerateName:"whisker-54f476585f-", Namespace:"calico-system", SelfLink:"", UID:"3f14b109-fa73-4313-9c07-3d3c314a58ba", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54f476585f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-54f476585f-nqgnj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali028135166fd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:43.634846 containerd[1610]: 2025-05-16 16:42:43.605 [INFO][4133] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.634908 containerd[1610]: 2025-05-16 16:42:43.605 [INFO][4133] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali028135166fd ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.634908 containerd[1610]: 2025-05-16 16:42:43.617 [INFO][4133] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.634944 containerd[1610]: 2025-05-16 16:42:43.618 [INFO][4133] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--54f476585f--nqgnj-eth0", GenerateName:"whisker-54f476585f-", Namespace:"calico-system", SelfLink:"", UID:"3f14b109-fa73-4313-9c07-3d3c314a58ba", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54f476585f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa", Pod:"whisker-54f476585f-nqgnj", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali028135166fd", MAC:"e2:7d:95:b0:8e:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:43.634983 containerd[1610]: 2025-05-16 16:42:43.627 [INFO][4133] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" Namespace="calico-system" Pod="whisker-54f476585f-nqgnj" WorkloadEndpoint="localhost-k8s-whisker--54f476585f--nqgnj-eth0" May 16 16:42:43.636765 kubelet[2904]: I0516 16:42:43.635594 2904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1703aa4-4ec2-4b1a-8bb3-ef63b2681873" path="/var/lib/kubelet/pods/c1703aa4-4ec2-4b1a-8bb3-ef63b2681873/volumes" May 16 16:42:43.751353 containerd[1610]: time="2025-05-16T16:42:43.750500374Z" level=info msg="connecting to shim ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa" address="unix:///run/containerd/s/287c14c7d08b4e111297a40124e35d07990068297be84e5faa705aa88c331606" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:43.781151 systemd[1]: Started cri-containerd-ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa.scope - libcontainer container ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa. 
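Note: the IPAM trace above is the complete Calico allocation sequence for the whisker pod: take the host-wide lock, confirm this node's affinity for the block 192.168.88.128/26, claim the first free address (192.168.88.129), write the block back, and release the lock, all under a handle named "k8s-pod-network." plus the container ID. The Go sketch below is illustrative only, not Calico's implementation (the real IPAM lives in projectcalico's libcalico-go); it merely mimics the handle naming and next-free-address selection visible in the log.

package main

import (
	"fmt"
	"net/netip"
)

// nextFreeIP walks a CIDR block and returns the first address not yet
// allocated, which is what the "Attempting to assign 1 addresses from
// block" step above does conceptually.
func nextFreeIP(block netip.Prefix, allocated map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !allocated[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")
	// .128 is the block's first address; in the trace the first workload
	// received .129, so treat .128 as taken here.
	allocated := map[netip.Addr]bool{netip.MustParseAddr("192.168.88.128"): true}

	containerID := "ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa"
	handleID := "k8s-pod-network." + containerID // naming visible in the log lines above

	if ip, ok := nextFreeIP(block, allocated); ok {
		fmt.Printf("handle=%s assigned=%s/26\n", handleID, ip)
	}
}

Run as-is, this prints the same handle and 192.168.88.129, matching the "Successfully claimed IPs" line above.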
May 16 16:42:43.789793 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:43.821844 containerd[1610]: time="2025-05-16T16:42:43.821797800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54f476585f-nqgnj,Uid:3f14b109-fa73-4313-9c07-3d3c314a58ba,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac1e2e89378d25a5c14933d69272038d22aeb02d4d590acdd5d9a394d9af6caa\"" May 16 16:42:43.822852 containerd[1610]: time="2025-05-16T16:42:43.822830235Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:42:44.145857 containerd[1610]: time="2025-05-16T16:42:44.145703136Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:42:44.149345 containerd[1610]: time="2025-05-16T16:42:44.149273836Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:42:44.149345 containerd[1610]: time="2025-05-16T16:42:44.149319816Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:42:44.149453 kubelet[2904]: E0516 16:42:44.149428 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:42:44.149670 kubelet[2904]: E0516 16:42:44.149466 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:42:44.155306 kubelet[2904]: E0516 16:42:44.155271 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c110a762b15a493fa59124ec728c2d93,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:42:44.157099 containerd[1610]: time="2025-05-16T16:42:44.157077276Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:42:44.421340 containerd[1610]: time="2025-05-16T16:42:44.421303660Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:42:44.421831 containerd[1610]: time="2025-05-16T16:42:44.421616844Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:42:44.421831 containerd[1610]: time="2025-05-16T16:42:44.421681301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:42:44.421923 kubelet[2904]: E0516 16:42:44.421791 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:42:44.421923 kubelet[2904]: E0516 16:42:44.421822 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:42:44.421992 kubelet[2904]: E0516 16:42:44.421898 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:42:44.426913 kubelet[2904]: E0516 16:42:44.426831 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:42:44.620922 systemd-networkd[1541]: cali028135166fd: Gained IPv6LL May 16 16:42:44.684830 systemd-networkd[1541]: vxlan.calico: Gained IPv6LL May 16 16:42:44.795888 kubelet[2904]: E0516 16:42:44.795861 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:42:45.798571 kubelet[2904]: E0516 16:42:45.798533 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:42:46.597582 containerd[1610]: time="2025-05-16T16:42:46.597545463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-mxd6d,Uid:03784e1b-ebb0-4a4e-8922-7c6e649932aa,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:46.682247 systemd-networkd[1541]: cali96b73110625: Link UP May 16 16:42:46.682501 systemd-networkd[1541]: cali96b73110625: Gained carrier May 16 16:42:46.702450 containerd[1610]: 2025-05-16 16:42:46.634 [INFO][4320] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0 calico-apiserver-bd5dbfdd- calico-apiserver 03784e1b-ebb0-4a4e-8922-7c6e649932aa 844 0 2025-05-16 16:42:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bd5dbfdd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-bd5dbfdd-mxd6d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali96b73110625 [] [] }} ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-" May 16 16:42:46.702450 containerd[1610]: 2025-05-16 16:42:46.634 [INFO][4320] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" 
Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.702450 containerd[1610]: 2025-05-16 16:42:46.655 [INFO][4332] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.655 [INFO][4332] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002358d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-bd5dbfdd-mxd6d", "timestamp":"2025-05-16 16:42:46.655596966 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.655 [INFO][4332] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.655 [INFO][4332] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.655 [INFO][4332] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.660 [INFO][4332] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" host="localhost" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.663 [INFO][4332] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.665 [INFO][4332] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.666 [INFO][4332] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.668 [INFO][4332] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:46.707810 containerd[1610]: 2025-05-16 16:42:46.668 [INFO][4332] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" host="localhost" May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.669 [INFO][4332] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.672 [INFO][4332] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" host="localhost" May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.677 [INFO][4332] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] 
block=192.168.88.128/26 handle="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" host="localhost" May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.677 [INFO][4332] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" host="localhost" May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.677 [INFO][4332] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:46.707990 containerd[1610]: 2025-05-16 16:42:46.677 [INFO][4332] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.708088 containerd[1610]: 2025-05-16 16:42:46.679 [INFO][4320] cni-plugin/k8s.go 418: Populated endpoint ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0", GenerateName:"calico-apiserver-bd5dbfdd-", Namespace:"calico-apiserver", SelfLink:"", UID:"03784e1b-ebb0-4a4e-8922-7c6e649932aa", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bd5dbfdd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-bd5dbfdd-mxd6d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali96b73110625", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:46.708127 containerd[1610]: 2025-05-16 16:42:46.680 [INFO][4320] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.708127 containerd[1610]: 2025-05-16 16:42:46.680 [INFO][4320] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96b73110625 ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.708127 containerd[1610]: 2025-05-16 
16:42:46.682 [INFO][4320] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.708176 containerd[1610]: 2025-05-16 16:42:46.684 [INFO][4320] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0", GenerateName:"calico-apiserver-bd5dbfdd-", Namespace:"calico-apiserver", SelfLink:"", UID:"03784e1b-ebb0-4a4e-8922-7c6e649932aa", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bd5dbfdd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac", Pod:"calico-apiserver-bd5dbfdd-mxd6d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali96b73110625", MAC:"b6:b5:6e:de:4b:4b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:46.708214 containerd[1610]: 2025-05-16 16:42:46.698 [INFO][4320] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-mxd6d" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:42:46.787199 containerd[1610]: time="2025-05-16T16:42:46.787143660Z" level=info msg="connecting to shim 005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" address="unix:///run/containerd/s/cbcfb9074dbfb1e8e05787b2a5b0967a6b5580db9b79ad1088c1562a2cca78de" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:46.809874 systemd[1]: Started cri-containerd-005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac.scope - libcontainer container 005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac. 
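Note: the whisker sandbox above came up cleanly, but at 16:42:44 both of its image pulls died at the registry auth step: containerd requests an anonymous pull token from ghcr.io and receives 403 Forbidden, which kubelet surfaces first as ErrImagePull and then as ImagePullBackOff. The failing request can be reproduced outside the runtime; the URL in this sketch is copied verbatim from the error message, and a pullable repository would instead answer 200 with a JSON token body.

package main

import (
	"fmt"
	"net/http"
)

func main() {
	// Token endpoint exactly as it appears in the containerd/kubelet errors above.
	url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io"
	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // the log above shows: 403 Forbidden
}

Until that endpoint stops returning 403, no amount of kubelet back-off will start the whisker containers; the calico-apiserver pull initiated below goes to the same registry.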
May 16 16:42:46.817959 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:46.844706 containerd[1610]: time="2025-05-16T16:42:46.844654256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-mxd6d,Uid:03784e1b-ebb0-4a4e-8922-7c6e649932aa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\"" May 16 16:42:46.846814 containerd[1610]: time="2025-05-16T16:42:46.846223878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 16:42:47.579333 containerd[1610]: time="2025-05-16T16:42:47.579285543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5kps5,Uid:f5332b41-bb78-4eb1-8393-01391eebcbd1,Namespace:kube-system,Attempt:0,}" May 16 16:42:47.579432 containerd[1610]: time="2025-05-16T16:42:47.579390316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-7fnf8,Uid:1378257a-134f-443c-9608-a7936ba8cf76,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:47.579931 containerd[1610]: time="2025-05-16T16:42:47.579790890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mckgq,Uid:bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd,Namespace:kube-system,Attempt:0,}" May 16 16:42:47.760786 systemd-networkd[1541]: cali65af1e78e6d: Link UP May 16 16:42:47.766275 systemd-networkd[1541]: cali65af1e78e6d: Gained carrier May 16 16:42:47.783333 containerd[1610]: 2025-05-16 16:42:47.684 [INFO][4399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0 coredns-7c65d6cfc9- kube-system f5332b41-bb78-4eb1-8393-01391eebcbd1 843 0 2025-05-16 16:42:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-5kps5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali65af1e78e6d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-" May 16 16:42:47.783333 containerd[1610]: 2025-05-16 16:42:47.684 [INFO][4399] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.783333 containerd[1610]: 2025-05-16 16:42:47.719 [INFO][4436] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" HandleID="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Workload="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.719 [INFO][4436] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" HandleID="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Workload="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9890), 
Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-5kps5", "timestamp":"2025-05-16 16:42:47.71952693 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.719 [INFO][4436] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.719 [INFO][4436] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.719 [INFO][4436] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.729 [INFO][4436] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" host="localhost" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.735 [INFO][4436] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.739 [INFO][4436] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.745 [INFO][4436] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.747 [INFO][4436] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:47.783661 containerd[1610]: 2025-05-16 16:42:47.747 [INFO][4436] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" host="localhost" May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.748 [INFO][4436] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43 May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.751 [INFO][4436] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" host="localhost" May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4436] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" host="localhost" May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4436] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" host="localhost" May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4436] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
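Note: with the coredns-7c65d6cfc9-5kps5 allocation above, this node has now handed out 192.168.88.129 through .131 from the same block, and .132 and .133 follow below; addresses come out sequentially because every request funnels through the single host-affine /26. A /26 holds 2^(32-26) = 64 addresses, so one block covers up to 64 endpoints on this node before another block would have to be claimed (the exact usable count depends on Calico's reservation settings). The arithmetic:

package main

import "fmt"

func main() {
	const prefixLen = 26
	// 2^(32-26) = 64 addresses in 192.168.88.128/26.
	fmt.Printf("192.168.88.128/%d -> %d addresses\n", prefixLen, 1<<(32-prefixLen))
}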
May 16 16:42:47.784233 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4436] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" HandleID="k8s-pod-network.1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Workload="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.784533 containerd[1610]: 2025-05-16 16:42:47.757 [INFO][4399] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f5332b41-bb78-4eb1-8393-01391eebcbd1", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-5kps5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali65af1e78e6d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:47.784614 containerd[1610]: 2025-05-16 16:42:47.757 [INFO][4399] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.784614 containerd[1610]: 2025-05-16 16:42:47.757 [INFO][4399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali65af1e78e6d ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.784614 containerd[1610]: 2025-05-16 16:42:47.766 [INFO][4399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.784765 
containerd[1610]: 2025-05-16 16:42:47.767 [INFO][4399] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f5332b41-bb78-4eb1-8393-01391eebcbd1", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43", Pod:"coredns-7c65d6cfc9-5kps5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali65af1e78e6d", MAC:"76:a4:7d:3d:dd:1a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:47.784765 containerd[1610]: 2025-05-16 16:42:47.780 [INFO][4399] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5kps5" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--5kps5-eth0" May 16 16:42:47.804609 containerd[1610]: time="2025-05-16T16:42:47.804563205Z" level=info msg="connecting to shim 1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43" address="unix:///run/containerd/s/d0638079f71073483599d6524f2763621e7a546b6ffaa1ff29fe2829778f1ad2" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:47.829883 systemd[1]: Started cri-containerd-1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43.scope - libcontainer container 1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43. 
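Note: unlike the whisker and apiserver endpoints, the coredns WorkloadEndpoint dumps above carry Ports entries, which the Go struct formatter prints in hex (Port:0x35, Port:0x23c1). Decoding them confirms the {dns UDP 53}, {dns-tcp TCP 53} and {metrics TCP 9153} summary from the plugin.go line:

package main

import "fmt"

func main() {
	// Hex port values exactly as dumped in the endpoint structs above.
	for name, port := range map[string]int{"dns": 0x35, "dns-tcp": 0x35, "metrics": 0x23c1} {
		fmt.Printf("%s -> %d\n", name, port) // 0x35 = 53, 0x23c1 = 9153
	}
}

That is DNS on 53/UDP and 53/TCP plus the coredns Prometheus metrics port 9153.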
May 16 16:42:47.847406 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:47.916093 containerd[1610]: time="2025-05-16T16:42:47.916020010Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5kps5,Uid:f5332b41-bb78-4eb1-8393-01391eebcbd1,Namespace:kube-system,Attempt:0,} returns sandbox id \"1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43\"" May 16 16:42:47.932292 systemd-networkd[1541]: calie63306bbb26: Link UP May 16 16:42:47.932421 systemd-networkd[1541]: calie63306bbb26: Gained carrier May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.679 [INFO][4392] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0 calico-apiserver-bd5dbfdd- calico-apiserver 1378257a-134f-443c-9608-a7936ba8cf76 847 0 2025-05-16 16:42:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:bd5dbfdd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-bd5dbfdd-7fnf8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie63306bbb26 [] [] }} ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.679 [INFO][4392] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.741 [INFO][4430] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.741 [INFO][4430] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039b7a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-bd5dbfdd-7fnf8", "timestamp":"2025-05-16 16:42:47.74155622 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.741 [INFO][4430] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4430] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
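Note: the timestamps here show the three concurrent CNI requests serializing on Calico's host-wide IPAM lock: [4436] acquires it immediately at 47.719, [4430] logs "About to acquire" at 47.741 but only gets it at 47.755 when [4436] releases, and [4441] waits from 47.746 until [4430] releases below at 47.923. In-process the pattern is plain mutual exclusion; the sketch below uses a sync.Mutex purely as a stand-in (Calico's actual lock is host-wide across processes, not an in-process mutex).

package main

import (
	"fmt"
	"sync"
)

func main() {
	var hostWideLock sync.Mutex // stand-in for Calico's host-wide IPAM lock
	var wg sync.WaitGroup
	for _, id := range []string{"4436", "4430", "4441"} {
		wg.Add(1)
		go func(id string) {
			defer wg.Done()
			fmt.Printf("[%s] About to acquire host-wide IPAM lock.\n", id)
			hostWideLock.Lock()
			fmt.Printf("[%s] Acquired host-wide IPAM lock.\n", id)
			// ... assign one address from 192.168.88.128/26 ...
			hostWideLock.Unlock()
			fmt.Printf("[%s] Released host-wide IPAM lock.\n", id)
		}(id)
	}
	wg.Wait()
}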
May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.755 [INFO][4430] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.829 [INFO][4430] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.837 [INFO][4430] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.885 [INFO][4430] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.887 [INFO][4430] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.893 [INFO][4430] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.893 [INFO][4430] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.895 [INFO][4430] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4 May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.915 [INFO][4430] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4430] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4430] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" host="localhost" May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4430] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
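Note: each successful sandbox in this trace pairs an IPAM allocation with a host-side veth that systemd-networkd then reports up: cali028135166fd, cali96b73110625, cali65af1e78e6d and calie63306bbb26 so far, with calid7a340c660b following below. On the node those interfaces are visible to the Go standard library; a small sketch to enumerate them:

package main

import (
	"fmt"
	"net"
	"strings"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		// Calico names its host-side veths with the "cali" prefix.
		if strings.HasPrefix(ifc.Name, "cali") {
			fmt.Printf("%s mac=%s up=%v\n", ifc.Name, ifc.HardwareAddr, ifc.Flags&net.FlagUp != 0)
		}
	}
}

Each entry should report the MAC recorded in the matching "Added Mac, interface name, and active container ID to endpoint" dump (e.g. 76:a4:7d:3d:dd:1a for cali65af1e78e6d).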
May 16 16:42:47.956139 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4430] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.926 [INFO][4392] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0", GenerateName:"calico-apiserver-bd5dbfdd-", Namespace:"calico-apiserver", SelfLink:"", UID:"1378257a-134f-443c-9608-a7936ba8cf76", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bd5dbfdd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-bd5dbfdd-7fnf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie63306bbb26", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.927 [INFO][4392] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.927 [INFO][4392] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie63306bbb26 ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.931 [INFO][4392] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.935 [INFO][4392] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0", GenerateName:"calico-apiserver-bd5dbfdd-", Namespace:"calico-apiserver", SelfLink:"", UID:"1378257a-134f-443c-9608-a7936ba8cf76", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"bd5dbfdd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4", Pod:"calico-apiserver-bd5dbfdd-7fnf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie63306bbb26", MAC:"0a:a4:4d:ea:62:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:47.960055 containerd[1610]: 2025-05-16 16:42:47.953 [INFO][4392] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Namespace="calico-apiserver" Pod="calico-apiserver-bd5dbfdd-7fnf8" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:42:47.988013 systemd-networkd[1541]: calid7a340c660b: Link UP May 16 16:42:47.989707 systemd-networkd[1541]: calid7a340c660b: Gained carrier May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.686 [INFO][4396] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0 coredns-7c65d6cfc9- kube-system bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd 839 0 2025-05-16 16:42:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-mckgq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid7a340c660b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.686 [INFO][4396] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.005443 
containerd[1610]: 2025-05-16 16:42:47.745 [INFO][4441] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" HandleID="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Workload="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.746 [INFO][4441] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" HandleID="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Workload="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fab0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-mckgq", "timestamp":"2025-05-16 16:42:47.745661687 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.746 [INFO][4441] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4441] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.923 [INFO][4441] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.934 [INFO][4441] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.940 [INFO][4441] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.954 [INFO][4441] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.956 [INFO][4441] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.959 [INFO][4441] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.960 [INFO][4441] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.962 [INFO][4441] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3 May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.968 [INFO][4441] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.980 [INFO][4441] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.981 
[INFO][4441] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" host="localhost" May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.981 [INFO][4441] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:48.005443 containerd[1610]: 2025-05-16 16:42:47.981 [INFO][4441] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" HandleID="k8s-pod-network.9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Workload="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.008437 containerd[1610]: 2025-05-16 16:42:47.982 [INFO][4396] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-mckgq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid7a340c660b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.008437 containerd[1610]: 2025-05-16 16:42:47.983 [INFO][4396] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.008437 containerd[1610]: 2025-05-16 16:42:47.983 [INFO][4396] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid7a340c660b ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.008437 containerd[1610]: 
2025-05-16 16:42:47.990 [INFO][4396] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.008437 containerd[1610]: 2025-05-16 16:42:47.991 [INFO][4396] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3", Pod:"coredns-7c65d6cfc9-mckgq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid7a340c660b", MAC:"9a:29:44:7a:01:c4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.008437 containerd[1610]: 2025-05-16 16:42:48.000 [INFO][4396] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mckgq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0" May 16 16:42:48.033056 containerd[1610]: time="2025-05-16T16:42:48.033005831Z" level=info msg="connecting to shim ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" address="unix:///run/containerd/s/f8a92d0c773f397b58166599474995ed2363c4ec16e9193373964ebb6eea1b5f" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:48.059196 containerd[1610]: time="2025-05-16T16:42:48.059169243Z" level=info msg="CreateContainer within sandbox \"1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 16 16:42:48.064514 systemd[1]: Started cri-containerd-ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4.scope - 
libcontainer container ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4. May 16 16:42:48.084192 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:48.115369 containerd[1610]: time="2025-05-16T16:42:48.115326078Z" level=info msg="connecting to shim 9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3" address="unix:///run/containerd/s/ea88d3847c6dd6c5edbc90b9d85aa3de98c1cfd894a0c914f5beff37ed1d638e" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:48.147760 systemd[1]: Started cri-containerd-9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3.scope - libcontainer container 9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3. May 16 16:42:48.162319 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:48.173769 containerd[1610]: time="2025-05-16T16:42:48.173744715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-bd5dbfdd-7fnf8,Uid:1378257a-134f-443c-9608-a7936ba8cf76,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\"" May 16 16:42:48.196168 containerd[1610]: time="2025-05-16T16:42:48.196143701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mckgq,Uid:bcfb93e8-c4e5-4104-8e91-9ce9dd0e68cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3\"" May 16 16:42:48.198124 containerd[1610]: time="2025-05-16T16:42:48.197900239Z" level=info msg="CreateContainer within sandbox \"9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 16 16:42:48.261220 containerd[1610]: time="2025-05-16T16:42:48.261179168Z" level=info msg="Container 567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:48.261499 containerd[1610]: time="2025-05-16T16:42:48.261479774Z" level=info msg="Container e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:48.288330 containerd[1610]: time="2025-05-16T16:42:48.288202988Z" level=info msg="CreateContainer within sandbox \"1ad53150cd196f238cdb6c18580feadc0f02db4638c6c69b3bc347aef4bf0e43\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639\"" May 16 16:42:48.288748 containerd[1610]: time="2025-05-16T16:42:48.288734523Z" level=info msg="StartContainer for \"e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639\"" May 16 16:42:48.295646 containerd[1610]: time="2025-05-16T16:42:48.295625003Z" level=info msg="connecting to shim e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639" address="unix:///run/containerd/s/d0638079f71073483599d6524f2763621e7a546b6ffaa1ff29fe2829778f1ad2" protocol=ttrpc version=3 May 16 16:42:48.302736 containerd[1610]: time="2025-05-16T16:42:48.302690053Z" level=info msg="CreateContainer within sandbox \"9890ad6bb817231829e965b854333df3e01d400d2133b800db889d708dd428d3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd\"" May 16 16:42:48.304781 containerd[1610]: time="2025-05-16T16:42:48.304749258Z" level=info msg="StartContainer for 
\"567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd\"" May 16 16:42:48.305355 containerd[1610]: time="2025-05-16T16:42:48.305337448Z" level=info msg="connecting to shim 567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd" address="unix:///run/containerd/s/ea88d3847c6dd6c5edbc90b9d85aa3de98c1cfd894a0c914f5beff37ed1d638e" protocol=ttrpc version=3 May 16 16:42:48.314868 systemd[1]: Started cri-containerd-e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639.scope - libcontainer container e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639. May 16 16:42:48.330848 systemd[1]: Started cri-containerd-567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd.scope - libcontainer container 567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd. May 16 16:42:48.438464 containerd[1610]: time="2025-05-16T16:42:48.438434999Z" level=info msg="StartContainer for \"e418cf445faf389ddf842491a063ba0427e0e04c17818fdb14ec9c0c37f91639\" returns successfully" May 16 16:42:48.438673 containerd[1610]: time="2025-05-16T16:42:48.438523896Z" level=info msg="StartContainer for \"567624b9111f060478f0d3078dc5ef9f4b05aba15c2903337b8ce6ae3e8b59bd\" returns successfully" May 16 16:42:48.579178 containerd[1610]: time="2025-05-16T16:42:48.578542207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68574cbdb7-zks72,Uid:5816b0c4-629d-48da-91c2-b008ec0d38de,Namespace:calico-system,Attempt:0,}" May 16 16:42:48.588947 systemd-networkd[1541]: cali96b73110625: Gained IPv6LL May 16 16:42:48.597647 containerd[1610]: time="2025-05-16T16:42:48.597612711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-slfcz,Uid:ba5965bb-e11f-490d-8dee-db0aaabf3b26,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:48.600294 containerd[1610]: time="2025-05-16T16:42:48.600182701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czjpr,Uid:e2f3a294-0f1d-4058-aea4-a5b3b7a443f1,Namespace:calico-system,Attempt:0,}" May 16 16:42:48.781188 systemd-networkd[1541]: cali65af1e78e6d: Gained IPv6LL May 16 16:42:48.796550 systemd-networkd[1541]: cali61537305119: Link UP May 16 16:42:48.797533 systemd-networkd[1541]: cali61537305119: Gained carrier May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.653 [INFO][4667] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0 calico-kube-controllers-68574cbdb7- calico-system 5816b0c4-629d-48da-91c2-b008ec0d38de 848 0 2025-05-16 16:42:23 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68574cbdb7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-68574cbdb7-zks72 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali61537305119 [] [] }} ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.653 [INFO][4667] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" 
Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.707 [INFO][4700] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" HandleID="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Workload="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.707 [INFO][4700] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" HandleID="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Workload="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000234f80), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-68574cbdb7-zks72", "timestamp":"2025-05-16 16:42:48.707814201 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.708 [INFO][4700] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.708 [INFO][4700] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.708 [INFO][4700] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.714 [INFO][4700] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.719 [INFO][4700] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.747 [INFO][4700] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.749 [INFO][4700] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.751 [INFO][4700] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.751 [INFO][4700] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.755 [INFO][4700] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9 May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.771 [INFO][4700] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4700] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4700] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" host="localhost" May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4700] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:48.827576 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4700] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" HandleID="k8s-pod-network.96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Workload="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.792 [INFO][4667] cni-plugin/k8s.go 418: Populated endpoint ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0", GenerateName:"calico-kube-controllers-68574cbdb7-", Namespace:"calico-system", SelfLink:"", UID:"5816b0c4-629d-48da-91c2-b008ec0d38de", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68574cbdb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-68574cbdb7-zks72", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali61537305119", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.792 [INFO][4667] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.792 [INFO][4667] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61537305119 ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.798 [INFO][4667] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.799 [INFO][4667] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0", GenerateName:"calico-kube-controllers-68574cbdb7-", Namespace:"calico-system", SelfLink:"", UID:"5816b0c4-629d-48da-91c2-b008ec0d38de", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68574cbdb7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9", Pod:"calico-kube-controllers-68574cbdb7-zks72", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali61537305119", MAC:"96:45:0d:3c:37:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.862302 containerd[1610]: 2025-05-16 16:42:48.823 [INFO][4667] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" Namespace="calico-system" Pod="calico-kube-controllers-68574cbdb7-zks72" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68574cbdb7--zks72-eth0" May 16 16:42:48.911540 systemd-networkd[1541]: calia7886a33032: Link UP May 16 16:42:48.912668 systemd-networkd[1541]: calia7886a33032: Gained carrier May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.687 [INFO][4682] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0 calico-apiserver-567dc9f47- calico-apiserver ba5965bb-e11f-490d-8dee-db0aaabf3b26 849 0 2025-05-16 16:42:21 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:567dc9f47 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-567dc9f47-slfcz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia7886a33032 [] [] }} ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.687 [INFO][4682] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.758 [INFO][4708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" HandleID="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Workload="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.758 [INFO][4708] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" HandleID="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Workload="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d39f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-567dc9f47-slfcz", "timestamp":"2025-05-16 16:42:48.758128455 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.758 [INFO][4708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
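
The [4708] request above queues behind the same serialisation point the earlier [4441] and [4700] requests went through: every CNI ADD on the node takes a single host-wide IPAM lock before it reads or writes the allocation block, which is why each trace shows the strict acquire / auto-assign / release ordering. A toy Python model of that ordering only (assumed names; Calico's real lock in ipam_plugin.go is a node-level mechanism, not this code):

    # Models the serialisation visible in the log: the lock makes
    # acquire -> assign -> release atomic per node, so two concurrent
    # CNI ADDs can never claim the same address from the block.
    import threading

    ipam_lock = threading.Lock()
    _free = iter(range(133, 191))        # free host offsets in 192.168.88.128/26

    def auto_assign_ipv4() -> str:
        with ipam_lock:                  # "Acquired host-wide IPAM lock."
            ip = f"192.168.88.{next(_free)}/26"
        return ip                        # "Released host-wide IPAM lock."
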
May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.788 [INFO][4708] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.814 [INFO][4708] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.829 [INFO][4708] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.858 [INFO][4708] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.869 [INFO][4708] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.873 [INFO][4708] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.873 [INFO][4708] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.875 [INFO][4708] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196 May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.884 [INFO][4708] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.904 [INFO][4708] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.904 [INFO][4708] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" host="localhost" May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.904 [INFO][4708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
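
Every assignment so far comes out of the one block with affinity to this host, 192.168.88.128/26: .133 went to coredns-7c65d6cfc9-mckgq, .134 to calico-kube-controllers-68574cbdb7-zks72, and .135 here to calico-apiserver-567dc9f47-slfcz. The block arithmetic is easy to sanity-check with the Python stdlib:

    # Stdlib check of the block seen in the log: a /26 holds 64 addresses
    # (.128 through .191), and each claimed IP falls inside it.
    import ipaddress

    block = ipaddress.ip_network("192.168.88.128/26")
    print(block.num_addresses)           # 64

    for claimed in ("192.168.88.133", "192.168.88.134", "192.168.88.135"):
        assert ipaddress.ip_address(claimed) in block
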
May 16 16:42:48.932902 containerd[1610]: 2025-05-16 16:42:48.904 [INFO][4708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" HandleID="k8s-pod-network.5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Workload="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.907 [INFO][4682] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0", GenerateName:"calico-apiserver-567dc9f47-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba5965bb-e11f-490d-8dee-db0aaabf3b26", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567dc9f47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-567dc9f47-slfcz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia7886a33032", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.907 [INFO][4682] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.907 [INFO][4682] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia7886a33032 ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.913 [INFO][4682] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.913 [INFO][4682] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0", GenerateName:"calico-apiserver-567dc9f47-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba5965bb-e11f-490d-8dee-db0aaabf3b26", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567dc9f47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196", Pod:"calico-apiserver-567dc9f47-slfcz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia7886a33032", MAC:"06:2d:e0:f9:a7:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:48.938005 containerd[1610]: 2025-05-16 16:42:48.928 [INFO][4682] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-slfcz" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--slfcz-eth0" May 16 16:42:48.986867 kubelet[2904]: I0516 16:42:48.895457 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-mckgq" podStartSLOduration=37.895444594 podStartE2EDuration="37.895444594s" podCreationTimestamp="2025-05-16 16:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:48.87076535 +0000 UTC m=+41.421016885" watchObservedRunningTime="2025-05-16 16:42:48.895444594 +0000 UTC m=+41.445696125" May 16 16:42:49.078936 kubelet[2904]: I0516 16:42:48.987282 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-5kps5" podStartSLOduration=37.987266149999996 podStartE2EDuration="37.98726615s" podCreationTimestamp="2025-05-16 16:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:48.987132204 +0000 UTC m=+41.537383731" watchObservedRunningTime="2025-05-16 16:42:48.98726615 +0000 UTC m=+41.537517686" May 16 16:42:49.083444 systemd-networkd[1541]: cali5357cda49da: Link UP May 16 16:42:49.084020 systemd-networkd[1541]: cali5357cda49da: Gained carrier May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.715 [INFO][4694] cni-plugin/plugin.go 340: Calico CNI found 
existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--czjpr-eth0 csi-node-driver- calico-system e2f3a294-0f1d-4058-aea4-a5b3b7a443f1 729 0 2025-05-16 16:42:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:68bf44dd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-czjpr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5357cda49da [] [] }} ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.715 [INFO][4694] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.762 [INFO][4716] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" HandleID="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Workload="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.762 [INFO][4716] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" HandleID="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Workload="localhost-k8s-csi--node--driver--czjpr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333270), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-czjpr", "timestamp":"2025-05-16 16:42:48.762521554 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.762 [INFO][4716] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.904 [INFO][4716] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
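
The WorkloadEndpoint names in these traces are systematic: node name, the literal k8s orchestrator tag, the pod name with every dash doubled (so the pod name's own dashes survive the join), then the interface. A sketch that reproduces the names appearing in this log; the helper name is made up:

    # Hypothetical helper matching the names logged above; the doubled
    # dashes come from escaping '-' in the pod name before joining.
    def workload_endpoint_name(node: str, pod: str, iface: str) -> str:
        return f"{node}-k8s-{pod.replace('-', '--')}-{iface}"

    assert (workload_endpoint_name("localhost", "csi-node-driver-czjpr", "eth0")
            == "localhost-k8s-csi--node--driver--czjpr-eth0")
    assert (workload_endpoint_name("localhost", "coredns-7c65d6cfc9-mckgq", "eth0")
            == "localhost-k8s-coredns--7c65d6cfc9--mckgq-eth0")
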
May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.905 [INFO][4716] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.921 [INFO][4716] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.931 [INFO][4716] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.943 [INFO][4716] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.944 [INFO][4716] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.946 [INFO][4716] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.946 [INFO][4716] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.947 [INFO][4716] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5 May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.953 [INFO][4716] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:48.980 [INFO][4716] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:49.078 [INFO][4716] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" host="localhost" May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:49.078 [INFO][4716] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
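
That makes four completed IPAM traces in this capture (.133 through .136), each ending in the same "Successfully claimed IPs" line, so pulling the assignments back out of a saved journal is a one-regex job. A minimal sketch, assuming the captured text is already in a string:

    # The pattern matches the exact phrasing ipam.go logs above.
    import re

    CLAIMED = re.compile(r"Successfully claimed IPs: \[([0-9./]+)\]")

    def claimed_ips(journal: str) -> list[str]:
        return CLAIMED.findall(journal)

    # Feeding this section in yields ["192.168.88.133/26", "192.168.88.134/26",
    #                                 "192.168.88.135/26", "192.168.88.136/26"]
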
May 16 16:42:49.114405 containerd[1610]: 2025-05-16 16:42:49.078 [INFO][4716] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" HandleID="k8s-pod-network.562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Workload="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.081 [INFO][4694] cni-plugin/k8s.go 418: Populated endpoint ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--czjpr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-czjpr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5357cda49da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.081 [INFO][4694] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.081 [INFO][4694] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5357cda49da ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.083 [INFO][4694] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.084 [INFO][4694] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--czjpr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e2f3a294-0f1d-4058-aea4-a5b3b7a443f1", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5", Pod:"csi-node-driver-czjpr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5357cda49da", MAC:"3a:0a:d7:af:8f:6f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:49.115290 containerd[1610]: 2025-05-16 16:42:49.108 [INFO][4694] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" Namespace="calico-system" Pod="csi-node-driver-czjpr" WorkloadEndpoint="localhost-k8s-csi--node--driver--czjpr-eth0" May 16 16:42:49.176153 containerd[1610]: time="2025-05-16T16:42:49.176077014Z" level=info msg="connecting to shim 5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196" address="unix:///run/containerd/s/c8b833ccc27ea4cf00ed2087f07091402096884d27db61364380d6701d9505f7" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:49.180239 containerd[1610]: time="2025-05-16T16:42:49.180162046Z" level=info msg="connecting to shim 562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5" address="unix:///run/containerd/s/df37aeac530c807308efd65ce3cb6e1805431f65038e51f6426178e2a0b74d67" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:49.188212 containerd[1610]: time="2025-05-16T16:42:49.188151363Z" level=info msg="connecting to shim 96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9" address="unix:///run/containerd/s/ea0095ed3a9670b113d8f68037a50109ad3b1d00d5999a8a72299004823ddbf7" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:49.252978 systemd[1]: Started cri-containerd-5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196.scope - libcontainer container 5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196. May 16 16:42:49.275108 systemd[1]: Started cri-containerd-96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9.scope - libcontainer container 96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9. 
May 16 16:42:49.296942 systemd[1]: Started cri-containerd-562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5.scope - libcontainer container 562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5. May 16 16:42:49.348915 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:49.363908 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:49.367458 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:49.402245 containerd[1610]: time="2025-05-16T16:42:49.402209964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-czjpr,Uid:e2f3a294-0f1d-4058-aea4-a5b3b7a443f1,Namespace:calico-system,Attempt:0,} returns sandbox id \"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5\"" May 16 16:42:49.442754 containerd[1610]: time="2025-05-16T16:42:49.442703316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-slfcz,Uid:ba5965bb-e11f-490d-8dee-db0aaabf3b26,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196\"" May 16 16:42:49.445045 containerd[1610]: time="2025-05-16T16:42:49.443186889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68574cbdb7-zks72,Uid:5816b0c4-629d-48da-91c2-b008ec0d38de,Namespace:calico-system,Attempt:0,} returns sandbox id \"96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9\"" May 16 16:42:49.615291 systemd-networkd[1541]: calie63306bbb26: Gained IPv6LL May 16 16:42:49.805255 systemd-networkd[1541]: calid7a340c660b: Gained IPv6LL May 16 16:42:50.188878 systemd-networkd[1541]: calia7886a33032: Gained IPv6LL May 16 16:42:50.189512 systemd-networkd[1541]: cali61537305119: Gained IPv6LL May 16 16:42:50.239748 containerd[1610]: time="2025-05-16T16:42:50.239697415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:50.240356 containerd[1610]: time="2025-05-16T16:42:50.240330208Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 16 16:42:50.241142 containerd[1610]: time="2025-05-16T16:42:50.241100866Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:50.242541 containerd[1610]: time="2025-05-16T16:42:50.242508013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:50.243387 containerd[1610]: time="2025-05-16T16:42:50.243037491Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 3.396244777s" May 16 16:42:50.243387 containerd[1610]: time="2025-05-16T16:42:50.243064065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image 
reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 16 16:42:50.243745 containerd[1610]: time="2025-05-16T16:42:50.243721169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 16:42:50.250765 containerd[1610]: time="2025-05-16T16:42:50.250719130Z" level=info msg="CreateContainer within sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 16:42:50.255777 containerd[1610]: time="2025-05-16T16:42:50.254175440Z" level=info msg="Container a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:50.266950 containerd[1610]: time="2025-05-16T16:42:50.266864202Z" level=info msg="CreateContainer within sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\"" May 16 16:42:50.267393 containerd[1610]: time="2025-05-16T16:42:50.267370182Z" level=info msg="StartContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\"" May 16 16:42:50.268687 containerd[1610]: time="2025-05-16T16:42:50.268652579Z" level=info msg="connecting to shim a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab" address="unix:///run/containerd/s/cbcfb9074dbfb1e8e05787b2a5b0967a6b5580db9b79ad1088c1562a2cca78de" protocol=ttrpc version=3 May 16 16:42:50.290927 systemd[1]: Started cri-containerd-a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab.scope - libcontainer container a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab. May 16 16:42:50.325514 containerd[1610]: time="2025-05-16T16:42:50.325478223Z" level=info msg="StartContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" returns successfully" May 16 16:42:50.578003 containerd[1610]: time="2025-05-16T16:42:50.577626739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-7x2n4,Uid:f47c5b55-7da4-4cde-9213-a21e48a7736b,Namespace:calico-system,Attempt:0,}" May 16 16:42:50.646430 containerd[1610]: time="2025-05-16T16:42:50.646334617Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:50.648626 containerd[1610]: time="2025-05-16T16:42:50.648596350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 16 16:42:50.652449 containerd[1610]: time="2025-05-16T16:42:50.652411475Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 408.601848ms" May 16 16:42:50.652743 containerd[1610]: time="2025-05-16T16:42:50.652545513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 16 16:42:50.654466 containerd[1610]: time="2025-05-16T16:42:50.654444947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 16 16:42:50.655418 containerd[1610]: 
time="2025-05-16T16:42:50.654942916Z" level=info msg="CreateContainer within sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 16:42:50.662972 containerd[1610]: time="2025-05-16T16:42:50.662942194Z" level=info msg="Container 65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:50.671673 containerd[1610]: time="2025-05-16T16:42:50.671562195Z" level=info msg="CreateContainer within sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\"" May 16 16:42:50.672660 containerd[1610]: time="2025-05-16T16:42:50.672631696Z" level=info msg="StartContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\"" May 16 16:42:50.676672 containerd[1610]: time="2025-05-16T16:42:50.676637359Z" level=info msg="connecting to shim 65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac" address="unix:///run/containerd/s/f8a92d0c773f397b58166599474995ed2363c4ec16e9193373964ebb6eea1b5f" protocol=ttrpc version=3 May 16 16:42:50.700714 systemd[1]: Started cri-containerd-65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac.scope - libcontainer container 65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac. May 16 16:42:50.785260 containerd[1610]: time="2025-05-16T16:42:50.785207955Z" level=info msg="StartContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" returns successfully" May 16 16:42:50.842982 systemd-networkd[1541]: cali568912ba193: Link UP May 16 16:42:50.843159 systemd-networkd[1541]: cali568912ba193: Gained carrier May 16 16:42:50.860089 kubelet[2904]: I0516 16:42:50.859770 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-bd5dbfdd-7fnf8" podStartSLOduration=28.380962922 podStartE2EDuration="30.859083829s" podCreationTimestamp="2025-05-16 16:42:20 +0000 UTC" firstStartedPulling="2025-05-16 16:42:48.174933138 +0000 UTC m=+40.725184665" lastFinishedPulling="2025-05-16 16:42:50.653054045 +0000 UTC m=+43.203305572" observedRunningTime="2025-05-16 16:42:50.847498648 +0000 UTC m=+43.397750183" watchObservedRunningTime="2025-05-16 16:42:50.859083829 +0000 UTC m=+43.409335359" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.714 [INFO][4959] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0 goldmane-8f77d7b6c- calico-system f47c5b55-7da4-4cde-9213-a21e48a7736b 846 0 2025-05-16 16:42:22 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:8f77d7b6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-8f77d7b6c-7x2n4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali568912ba193 [] [] }} ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.717 [INFO][4959] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.783 [INFO][4991] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" HandleID="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Workload="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.784 [INFO][4991] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" HandleID="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Workload="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac460), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-8f77d7b6c-7x2n4", "timestamp":"2025-05-16 16:42:50.783260826 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.784 [INFO][4991] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.784 [INFO][4991] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.784 [INFO][4991] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.792 [INFO][4991] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.798 [INFO][4991] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.802 [INFO][4991] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.804 [INFO][4991] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.806 [INFO][4991] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.806 [INFO][4991] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.807 [INFO][4991] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585 May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.810 [INFO][4991] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.817 [INFO][4991] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 handle="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.818 [INFO][4991] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" host="localhost" May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.818 [INFO][4991] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:50.863368 containerd[1610]: 2025-05-16 16:42:50.818 [INFO][4991] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" HandleID="k8s-pod-network.e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Workload="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.827 [INFO][4959] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"f47c5b55-7da4-4cde-9213-a21e48a7736b", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-8f77d7b6c-7x2n4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali568912ba193", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.828 [INFO][4959] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.828 [INFO][4959] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali568912ba193 ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.842 [INFO][4959] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.842 [INFO][4959] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"f47c5b55-7da4-4cde-9213-a21e48a7736b", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585", Pod:"goldmane-8f77d7b6c-7x2n4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali568912ba193", MAC:"5e:0d:75:01:ad:a9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:50.867791 containerd[1610]: 2025-05-16 16:42:50.857 [INFO][4959] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" Namespace="calico-system" Pod="goldmane-8f77d7b6c-7x2n4" WorkloadEndpoint="localhost-k8s-goldmane--8f77d7b6c--7x2n4-eth0" May 16 16:42:50.886039 kubelet[2904]: I0516 16:42:50.885722 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-bd5dbfdd-mxd6d" podStartSLOduration=27.487929329 podStartE2EDuration="30.885613983s" podCreationTimestamp="2025-05-16 16:42:20 +0000 UTC" firstStartedPulling="2025-05-16 16:42:46.845909696 +0000 UTC m=+39.396161222" lastFinishedPulling="2025-05-16 16:42:50.243594347 +0000 UTC m=+42.793845876" observedRunningTime="2025-05-16 16:42:50.878247523 +0000 UTC m=+43.428499058" watchObservedRunningTime="2025-05-16 16:42:50.885613983 +0000 UTC m=+43.435865517" May 16 16:42:50.925456 containerd[1610]: time="2025-05-16T16:42:50.925272646Z" level=info msg="connecting to shim e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585" address="unix:///run/containerd/s/41c3c270be3ff1cbb51e3284ef6f34a00ba4b03488e175f75a3e91efaef57f41" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:50.954907 systemd[1]: Started cri-containerd-e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585.scope - libcontainer container e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585. 
May 16 16:42:50.983232 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:51.020853 systemd-networkd[1541]: cali5357cda49da: Gained IPv6LL May 16 16:42:51.050346 containerd[1610]: time="2025-05-16T16:42:51.050270320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-7x2n4,Uid:f47c5b55-7da4-4cde-9213-a21e48a7736b,Namespace:calico-system,Attempt:0,} returns sandbox id \"e7dd34a137ce389e4c9ee32fd7c65a44071b3d77f3f070ebba68c0987bfd2585\"" May 16 16:42:52.257288 containerd[1610]: time="2025-05-16T16:42:52.255964802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:52.259640 containerd[1610]: time="2025-05-16T16:42:52.259593361Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:52.259932 containerd[1610]: time="2025-05-16T16:42:52.259916080Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 16 16:42:52.265821 containerd[1610]: time="2025-05-16T16:42:52.265705580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:52.266351 containerd[1610]: time="2025-05-16T16:42:52.266327269Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 1.611772853s" May 16 16:42:52.266351 containerd[1610]: time="2025-05-16T16:42:52.266351597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 16 16:42:52.270423 containerd[1610]: time="2025-05-16T16:42:52.270396450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 16:42:52.277375 containerd[1610]: time="2025-05-16T16:42:52.276723481Z" level=info msg="CreateContainer within sandbox \"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 16 16:42:52.369671 containerd[1610]: time="2025-05-16T16:42:52.368972997Z" level=info msg="Container 628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:52.371206 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2070793892.mount: Deactivated successfully. 
May 16 16:42:52.409822 containerd[1610]: time="2025-05-16T16:42:52.409789246Z" level=info msg="CreateContainer within sandbox \"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830\"" May 16 16:42:52.410524 containerd[1610]: time="2025-05-16T16:42:52.410503281Z" level=info msg="StartContainer for \"628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830\"" May 16 16:42:52.411749 containerd[1610]: time="2025-05-16T16:42:52.411688329Z" level=info msg="connecting to shim 628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830" address="unix:///run/containerd/s/df37aeac530c807308efd65ce3cb6e1805431f65038e51f6426178e2a0b74d67" protocol=ttrpc version=3 May 16 16:42:52.482883 systemd[1]: Started cri-containerd-628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830.scope - libcontainer container 628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830. May 16 16:42:52.520421 containerd[1610]: time="2025-05-16T16:42:52.520162432Z" level=info msg="StartContainer for \"628c4650406287321ec073fafd4d214c15b8ef97838512733b007f438dbe2830\" returns successfully" May 16 16:42:52.621828 systemd-networkd[1541]: cali568912ba193: Gained IPv6LL May 16 16:42:52.683179 containerd[1610]: time="2025-05-16T16:42:52.683147606Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:52.683996 containerd[1610]: time="2025-05-16T16:42:52.683973667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 16 16:42:52.693991 containerd[1610]: time="2025-05-16T16:42:52.693960707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 423.535043ms" May 16 16:42:52.693991 containerd[1610]: time="2025-05-16T16:42:52.693989996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 16 16:42:52.696072 containerd[1610]: time="2025-05-16T16:42:52.694649007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 16 16:42:52.697801 containerd[1610]: time="2025-05-16T16:42:52.696764429Z" level=info msg="CreateContainer within sandbox \"5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 16:42:52.701746 containerd[1610]: time="2025-05-16T16:42:52.701507077Z" level=info msg="Container ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:52.704520 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2006220377.mount: Deactivated successfully. 
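[Editor's note] The mount units being deactivated above (var-lib-containerd-tmpmounts-containerd\x2dmount2070793892.mount and ...2006220377.mount) look odd because of systemd's unit-name escaping: '/' in a path maps to '-', so a literal '-' inside a path component has to be escaped as \x2d. A rough sketch of that escaping, covering only the common cases (the authoritative rules live in systemd itself, e.g. systemd-escape --path):

```python
# Minimal sketch of systemd's path-to-unit-name escaping; assumption: only
# alphanumerics, '_' and '.' pass through unescaped, which suffices here.
def systemd_escape_path(path: str) -> str:
    out = []
    for ch in path.strip("/"):
        if ch == "/":
            out.append("-")                  # path separator becomes '-'
        elif ch.isalnum() or ch in "_.":
            out.append(ch)
        else:
            out.append(f"\\x{ord(ch):02x}")  # e.g. '-' -> \x2d
    return "".join(out)

unit = systemd_escape_path("/var/lib/containerd/tmpmounts/containerd-mount2070793892") + ".mount"
print(unit)  # var-lib-containerd-tmpmounts-containerd\x2dmount2070793892.mount
```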
May 16 16:42:52.716040 containerd[1610]: time="2025-05-16T16:42:52.716007981Z" level=info msg="CreateContainer within sandbox \"5793cf4fb5503a805dd49a057e25a25d3a380a1e4539536e9de35c59f095c196\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37\"" May 16 16:42:52.716626 containerd[1610]: time="2025-05-16T16:42:52.716577961Z" level=info msg="StartContainer for \"ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37\"" May 16 16:42:52.717563 containerd[1610]: time="2025-05-16T16:42:52.717539305Z" level=info msg="connecting to shim ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37" address="unix:///run/containerd/s/c8b833ccc27ea4cf00ed2087f07091402096884d27db61364380d6701d9505f7" protocol=ttrpc version=3 May 16 16:42:52.730855 systemd[1]: Started cri-containerd-ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37.scope - libcontainer container ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37. May 16 16:42:52.771870 containerd[1610]: time="2025-05-16T16:42:52.771776528Z" level=info msg="StartContainer for \"ba8fb292df849634c583f9d038bde90522891b756c635eb411a9fdc35ced2e37\" returns successfully" May 16 16:42:52.873673 kubelet[2904]: I0516 16:42:52.873374 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-567dc9f47-slfcz" podStartSLOduration=28.623242085 podStartE2EDuration="31.87335619s" podCreationTimestamp="2025-05-16 16:42:21 +0000 UTC" firstStartedPulling="2025-05-16 16:42:49.444482434 +0000 UTC m=+41.994733961" lastFinishedPulling="2025-05-16 16:42:52.69459654 +0000 UTC m=+45.244848066" observedRunningTime="2025-05-16 16:42:52.872884638 +0000 UTC m=+45.423136173" watchObservedRunningTime="2025-05-16 16:42:52.87335619 +0000 UTC m=+45.423607719" May 16 16:42:53.854055 kubelet[2904]: I0516 16:42:53.854022 2904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 16:42:57.539505 containerd[1610]: time="2025-05-16T16:42:57.539114906Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.640717 containerd[1610]: time="2025-05-16T16:42:57.640683913Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 16 16:42:57.666565 containerd[1610]: time="2025-05-16T16:42:57.665804849Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.667084 containerd[1610]: time="2025-05-16T16:42:57.667058788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.667427 containerd[1610]: time="2025-05-16T16:42:57.667411054Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 4.972745706s" May 16 16:42:57.667457 containerd[1610]: 
time="2025-05-16T16:42:57.667429995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 16 16:42:57.687918 containerd[1610]: time="2025-05-16T16:42:57.687873109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 16:42:57.830801 containerd[1610]: time="2025-05-16T16:42:57.830664890Z" level=info msg="CreateContainer within sandbox \"96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 16 16:42:57.889932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3818659975.mount: Deactivated successfully. May 16 16:42:57.908507 containerd[1610]: time="2025-05-16T16:42:57.890061210Z" level=info msg="Container c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:58.041369 containerd[1610]: time="2025-05-16T16:42:58.041106360Z" level=info msg="CreateContainer within sandbox \"96c3dfda05304783adfb2247102b681aae741dab28edef88f3a783dfb1bd0ec9\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\"" May 16 16:42:58.092847 containerd[1610]: time="2025-05-16T16:42:58.092473391Z" level=info msg="StartContainer for \"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\"" May 16 16:42:58.095403 containerd[1610]: time="2025-05-16T16:42:58.095338666Z" level=info msg="connecting to shim c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362" address="unix:///run/containerd/s/ea0095ed3a9670b113d8f68037a50109ad3b1d00d5999a8a72299004823ddbf7" protocol=ttrpc version=3 May 16 16:42:58.105198 containerd[1610]: time="2025-05-16T16:42:58.105055031Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:42:58.105528 containerd[1610]: time="2025-05-16T16:42:58.105512558Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 16:42:58.126353 containerd[1610]: time="2025-05-16T16:42:58.118513245Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:42:58.134001 kubelet[2904]: E0516 16:42:58.133812 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:42:58.134001 kubelet[2904]: E0516 16:42:58.133862 2904 kuberuntime_image.go:55] "Failed to pull image" 
err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:42:58.135370 containerd[1610]: time="2025-05-16T16:42:58.135338486Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 16 16:42:58.143345 kubelet[2904]: E0516 16:42:58.143301 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46x4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-7x2n4_calico-system(f47c5b55-7da4-4cde-9213-a21e48a7736b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:42:58.147059 kubelet[2904]: E0516 16:42:58.147026 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:42:58.221881 systemd[1]: Started cri-containerd-c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362.scope - libcontainer container c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362. May 16 16:42:58.296047 containerd[1610]: time="2025-05-16T16:42:58.296014772Z" level=info msg="StartContainer for \"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" returns successfully" May 16 16:42:58.913791 kubelet[2904]: E0516 16:42:58.913318 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:42:59.017610 kubelet[2904]: I0516 16:42:58.993428 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-68574cbdb7-zks72" podStartSLOduration=27.750567404999998 podStartE2EDuration="35.993410483s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="2025-05-16 16:42:49.444954275 +0000 UTC m=+41.995205801" lastFinishedPulling="2025-05-16 16:42:57.687797351 +0000 UTC m=+50.238048879" observedRunningTime="2025-05-16 16:42:58.932672826 +0000 UTC m=+51.482924361" watchObservedRunningTime="2025-05-16 16:42:58.993410483 +0000 UTC m=+51.543662017" May 16 16:42:59.023773 containerd[1610]: time="2025-05-16T16:42:59.023003643Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" id:\"e07210efae8faf863ccb02edffb9825915b556abf8b966e41d3774acf2d5d162\" pid:5221 exited_at:{seconds:1747413779 nanos:5159759}" May 16 16:43:01.046166 containerd[1610]: time="2025-05-16T16:43:01.046052608Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:01.046810 containerd[1610]: time="2025-05-16T16:43:01.046577858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 16 16:43:01.047645 containerd[1610]: time="2025-05-16T16:43:01.047025320Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:01.047964 containerd[1610]: time="2025-05-16T16:43:01.047948634Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:01.048365 containerd[1610]: time="2025-05-16T16:43:01.048350038Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 2.912783414s" May 16 16:43:01.048418 containerd[1610]: time="2025-05-16T16:43:01.048410072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 16 16:43:01.049478 containerd[1610]: time="2025-05-16T16:43:01.049463643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:43:01.052603 containerd[1610]: time="2025-05-16T16:43:01.050534433Z" level=info msg="CreateContainer within sandbox \"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 16 16:43:01.057745 containerd[1610]: time="2025-05-16T16:43:01.056316788Z" level=info msg="Container 77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7: CDI devices from CRI Config.CDIDevices: []" May 16 16:43:01.060676 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3122391362.mount: Deactivated successfully. May 16 16:43:01.063383 containerd[1610]: time="2025-05-16T16:43:01.063358037Z" level=info msg="CreateContainer within sandbox \"562d11c32cadfa3affaf6261400751ceb68ba78eedd72e4d73c6bd786a7f60a5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7\"" May 16 16:43:01.064571 containerd[1610]: time="2025-05-16T16:43:01.063815791Z" level=info msg="StartContainer for \"77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7\"" May 16 16:43:01.064815 containerd[1610]: time="2025-05-16T16:43:01.064799911Z" level=info msg="connecting to shim 77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7" address="unix:///run/containerd/s/df37aeac530c807308efd65ce3cb6e1805431f65038e51f6426178e2a0b74d67" protocol=ttrpc version=3 May 16 16:43:01.088876 systemd[1]: Started cri-containerd-77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7.scope - libcontainer container 77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7. 
May 16 16:43:01.116411 containerd[1610]: time="2025-05-16T16:43:01.116385643Z" level=info msg="StartContainer for \"77c5839528e19d40f024f55f14208ad256233d1e8928adc863a03491c80421e7\" returns successfully" May 16 16:43:01.288698 containerd[1610]: time="2025-05-16T16:43:01.288663269Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:01.289030 containerd[1610]: time="2025-05-16T16:43:01.289000834Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:01.289103 containerd[1610]: time="2025-05-16T16:43:01.289071124Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:43:01.300373 kubelet[2904]: E0516 16:43:01.300201 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:01.302312 kubelet[2904]: E0516 16:43:01.302260 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:01.302912 kubelet[2904]: E0516 16:43:01.302359 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c110a762b15a493fa59124ec728c2d93,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:01.304931 containerd[1610]: time="2025-05-16T16:43:01.304907910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:43:01.599494 containerd[1610]: time="2025-05-16T16:43:01.599414843Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:01.599939 containerd[1610]: time="2025-05-16T16:43:01.599892015Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:01.600042 containerd[1610]: time="2025-05-16T16:43:01.599956164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:43:01.600071 kubelet[2904]: E0516 16:43:01.600043 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:01.600403 kubelet[2904]: E0516 16:43:01.600072 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:01.600403 kubelet[2904]: E0516 16:43:01.600141 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:01.601428 kubelet[2904]: E0516 16:43:01.601395 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:43:02.182595 kubelet[2904]: I0516 16:43:02.182244 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-czjpr" podStartSLOduration=27.538872156 podStartE2EDuration="39.182230382s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="2025-05-16 16:42:49.405787115 +0000 UTC m=+41.956038642" lastFinishedPulling="2025-05-16 16:43:01.049145342 +0000 UTC m=+53.599396868" observedRunningTime="2025-05-16 16:43:02.158459682 +0000 UTC m=+54.708711209" watchObservedRunningTime="2025-05-16 16:43:02.182230382 +0000 UTC m=+54.732481920" May 16 16:43:02.874900 kubelet[2904]: I0516 16:43:02.858003 2904 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 16 16:43:02.966876 kubelet[2904]: I0516 16:43:02.966843 2904 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 16 16:43:09.279684 containerd[1610]: time="2025-05-16T16:43:09.279651755Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" id:\"2fd310b4cded6e0021e06eaf5c46c8b6beae7bdea746d9d2a38d61d2c17750cd\" pid:5291 exited_at:{seconds:1747413789 nanos:279391219}" May 16 16:43:09.784663 kubelet[2904]: I0516 16:43:09.784636 2904 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 16:43:09.852875 containerd[1610]: time="2025-05-16T16:43:09.852789599Z" level=info msg="StopContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" with timeout 30 (s)" May 16 16:43:09.859370 containerd[1610]: time="2025-05-16T16:43:09.859337808Z" level=info msg="Stop container \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" with signal terminated" May 16 16:43:09.879140 systemd[1]: cri-containerd-65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac.scope: Deactivated successfully. 
May 16 16:43:09.888267 containerd[1610]: time="2025-05-16T16:43:09.881782945Z" level=info msg="received exit event container_id:\"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" id:\"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" pid:4984 exit_status:1 exited_at:{seconds:1747413789 nanos:881596413}" May 16 16:43:09.888267 containerd[1610]: time="2025-05-16T16:43:09.881863834Z" level=info msg="TaskExit event in podsandbox handler container_id:\"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" id:\"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" pid:4984 exit_status:1 exited_at:{seconds:1747413789 nanos:881596413}" May 16 16:43:09.944261 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac-rootfs.mount: Deactivated successfully. May 16 16:43:10.071335 containerd[1610]: time="2025-05-16T16:43:10.071260001Z" level=info msg="StopContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" returns successfully" May 16 16:43:10.080856 containerd[1610]: time="2025-05-16T16:43:10.080833855Z" level=info msg="StopPodSandbox for \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\"" May 16 16:43:10.087182 containerd[1610]: time="2025-05-16T16:43:10.087138247Z" level=info msg="Container to stop \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 16 16:43:10.092823 systemd[1]: cri-containerd-ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4.scope: Deactivated successfully. May 16 16:43:10.094124 containerd[1610]: time="2025-05-16T16:43:10.093567461Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" id:\"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" pid:4555 exit_status:137 exited_at:{seconds:1747413790 nanos:93239305}" May 16 16:43:10.115689 systemd[1]: Created slice kubepods-besteffort-pod4a4e5916_b055_4956_bf19_3e003b4aac4a.slice - libcontainer container kubepods-besteffort-pod4a4e5916_b055_4956_bf19_3e003b4aac4a.slice. May 16 16:43:10.124146 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4-rootfs.mount: Deactivated successfully. May 16 16:43:10.135510 containerd[1610]: time="2025-05-16T16:43:10.135454580Z" level=info msg="shim disconnected" id=ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4 namespace=k8s.io May 16 16:43:10.136051 containerd[1610]: time="2025-05-16T16:43:10.135892110Z" level=warning msg="cleaning up after shim disconnected" id=ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4 namespace=k8s.io May 16 16:43:10.136051 containerd[1610]: time="2025-05-16T16:43:10.135900605Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 16 16:43:10.174151 containerd[1610]: time="2025-05-16T16:43:10.173782371Z" level=info msg="received exit event sandbox_id:\"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" exit_status:137 exited_at:{seconds:1747413790 nanos:93239305}" May 16 16:43:10.176538 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4-shm.mount: Deactivated successfully. 
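[Editor's note] In the teardown above, the application container exits with exit_status:1 after the TERM-based stop, while the sandbox reports exit_status:137. The 137 is the conventional 128 + signal-number encoding: 137 − 128 = 9, i.e. the sandbox process was ultimately killed with SIGKILL. A one-line check:

```python
import signal

# exit_status 137 from the TaskExit events above: 128 + SIGKILL(9)
print(128 + int(signal.SIGKILL))  # 137
```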
May 16 16:43:10.230753 kubelet[2904]: I0516 16:43:10.230693 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57cgl\" (UniqueName: \"kubernetes.io/projected/4a4e5916-b055-4956-bf19-3e003b4aac4a-kube-api-access-57cgl\") pod \"calico-apiserver-567dc9f47-mj2sv\" (UID: \"4a4e5916-b055-4956-bf19-3e003b4aac4a\") " pod="calico-apiserver/calico-apiserver-567dc9f47-mj2sv" May 16 16:43:10.230753 kubelet[2904]: I0516 16:43:10.230745 2904 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a4e5916-b055-4956-bf19-3e003b4aac4a-calico-apiserver-certs\") pod \"calico-apiserver-567dc9f47-mj2sv\" (UID: \"4a4e5916-b055-4956-bf19-3e003b4aac4a\") " pod="calico-apiserver/calico-apiserver-567dc9f47-mj2sv" May 16 16:43:10.419440 containerd[1610]: time="2025-05-16T16:43:10.419369712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-mj2sv,Uid:4a4e5916-b055-4956-bf19-3e003b4aac4a,Namespace:calico-apiserver,Attempt:0,}" May 16 16:43:10.542103 systemd-networkd[1541]: calie63306bbb26: Link DOWN May 16 16:43:10.542109 systemd-networkd[1541]: calie63306bbb26: Lost carrier May 16 16:43:10.853997 systemd-networkd[1541]: cali469ded6bb9a: Link UP May 16 16:43:10.855931 systemd-networkd[1541]: cali469ded6bb9a: Gained carrier May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.533 [INFO][5418] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0 calico-apiserver-567dc9f47- calico-apiserver 4a4e5916-b055-4956-bf19-3e003b4aac4a 1143 0 2025-05-16 16:43:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:567dc9f47 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-567dc9f47-mj2sv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali469ded6bb9a [] [] }} ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.536 [INFO][5418] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.804 [INFO][5434] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" HandleID="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Workload="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.807 [INFO][5434] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" HandleID="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Workload="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0xc00030c420), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-567dc9f47-mj2sv", "timestamp":"2025-05-16 16:43:10.804967451 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.807 [INFO][5434] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.807 [INFO][5434] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.807 [INFO][5434] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.818 [INFO][5434] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.833 [INFO][5434] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.836 [INFO][5434] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.837 [INFO][5434] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.838 [INFO][5434] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.838 [INFO][5434] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.839 [INFO][5434] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2 May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.841 [INFO][5434] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.846 [INFO][5434] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.138/26] block=192.168.88.128/26 handle="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.846 [INFO][5434] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.138/26] handle="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" host="localhost" May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.846 [INFO][5434] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 16:43:10.869107 containerd[1610]: 2025-05-16 16:43:10.846 [INFO][5434] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.138/26] IPv6=[] ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" HandleID="k8s-pod-network.d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Workload="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.849 [INFO][5418] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0", GenerateName:"calico-apiserver-567dc9f47-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a4e5916-b055-4956-bf19-3e003b4aac4a", ResourceVersion:"1143", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 43, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567dc9f47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-567dc9f47-mj2sv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali469ded6bb9a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.850 [INFO][5418] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.138/32] ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.850 [INFO][5418] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali469ded6bb9a ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.856 [INFO][5418] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.857 [INFO][5418] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0", GenerateName:"calico-apiserver-567dc9f47-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a4e5916-b055-4956-bf19-3e003b4aac4a", ResourceVersion:"1143", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 43, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567dc9f47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2", Pod:"calico-apiserver-567dc9f47-mj2sv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali469ded6bb9a", MAC:"ba:e6:a4:1c:7c:26", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:43:10.876541 containerd[1610]: 2025-05-16 16:43:10.865 [INFO][5418] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" Namespace="calico-apiserver" Pod="calico-apiserver-567dc9f47-mj2sv" WorkloadEndpoint="localhost-k8s-calico--apiserver--567dc9f47--mj2sv-eth0" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.538 [INFO][5411] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.538 [INFO][5411] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" iface="eth0" netns="/var/run/netns/cni-6c343077-aeb2-2909-dce8-ced3ca3a82a8" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.539 [INFO][5411] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" iface="eth0" netns="/var/run/netns/cni-6c343077-aeb2-2909-dce8-ced3ca3a82a8" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.546 [INFO][5411] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" after=6.986934ms iface="eth0" netns="/var/run/netns/cni-6c343077-aeb2-2909-dce8-ced3ca3a82a8" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.546 [INFO][5411] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.546 [INFO][5411] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.805 [INFO][5432] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.807 [INFO][5432] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.846 [INFO][5432] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.935 [INFO][5432] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.935 [INFO][5432] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.937 [INFO][5432] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:43:10.950194 containerd[1610]: 2025-05-16 16:43:10.940 [INFO][5411] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:43:10.953166 containerd[1610]: time="2025-05-16T16:43:10.952944097Z" level=info msg="TearDown network for sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" successfully" May 16 16:43:10.954278 containerd[1610]: time="2025-05-16T16:43:10.954051527Z" level=info msg="StopPodSandbox for \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" returns successfully" May 16 16:43:10.955945 systemd[1]: run-netns-cni\x2d6c343077\x2daeb2\x2d2909\x2ddce8\x2dced3ca3a82a8.mount: Deactivated successfully. May 16 16:43:10.962738 containerd[1610]: time="2025-05-16T16:43:10.962527524Z" level=info msg="connecting to shim d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2" address="unix:///run/containerd/s/946bf773dd70251ad965fae1a0b12a681a72f1266c63467dc74cd36f11f67f2f" namespace=k8s.io protocol=ttrpc version=3 May 16 16:43:10.985971 systemd[1]: Started cri-containerd-d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2.scope - libcontainer container d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2. 
May 16 16:43:10.997551 systemd-resolved[1492]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:43:11.043133 containerd[1610]: time="2025-05-16T16:43:11.043080371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567dc9f47-mj2sv,Uid:4a4e5916-b055-4956-bf19-3e003b4aac4a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2\"" May 16 16:43:11.057502 containerd[1610]: time="2025-05-16T16:43:11.057298894Z" level=info msg="CreateContainer within sandbox \"d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 16 16:43:11.090738 containerd[1610]: time="2025-05-16T16:43:11.088935704Z" level=info msg="Container fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1: CDI devices from CRI Config.CDIDevices: []" May 16 16:43:11.090373 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount358996417.mount: Deactivated successfully. May 16 16:43:11.094172 kubelet[2904]: I0516 16:43:11.094154 2904 scope.go:117] "RemoveContainer" containerID="65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac" May 16 16:43:11.099747 containerd[1610]: time="2025-05-16T16:43:11.099563022Z" level=info msg="CreateContainer within sandbox \"d560b69312f65d668b4e7fe9a24455c7f78ea0181b505d4ed5a03b8766ac31c2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1\"" May 16 16:43:11.103738 containerd[1610]: time="2025-05-16T16:43:11.103309856Z" level=info msg="StartContainer for \"fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1\"" May 16 16:43:11.104332 containerd[1610]: time="2025-05-16T16:43:11.104319943Z" level=info msg="connecting to shim fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1" address="unix:///run/containerd/s/946bf773dd70251ad965fae1a0b12a681a72f1266c63467dc74cd36f11f67f2f" protocol=ttrpc version=3 May 16 16:43:11.105017 containerd[1610]: time="2025-05-16T16:43:11.104699077Z" level=info msg="RemoveContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\"" May 16 16:43:11.115189 containerd[1610]: time="2025-05-16T16:43:11.115168885Z" level=info msg="RemoveContainer for \"65c8c40a5b0e0c7ff2ef797135b4acfe91c404356439125910f548626d63a5ac\" returns successfully" May 16 16:43:11.121836 systemd[1]: Started cri-containerd-fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1.scope - libcontainer container fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1. 
May 16 16:43:11.171569 containerd[1610]: time="2025-05-16T16:43:11.171521085Z" level=info msg="StartContainer for \"fa42253526d1778d4a8154bbd10df73adc09e4420260156c616cf675cfe94eb1\" returns successfully" May 16 16:43:11.224004 kubelet[2904]: I0516 16:43:11.223974 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhtvc\" (UniqueName: \"kubernetes.io/projected/1378257a-134f-443c-9608-a7936ba8cf76-kube-api-access-jhtvc\") pod \"1378257a-134f-443c-9608-a7936ba8cf76\" (UID: \"1378257a-134f-443c-9608-a7936ba8cf76\") " May 16 16:43:11.224004 kubelet[2904]: I0516 16:43:11.224012 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1378257a-134f-443c-9608-a7936ba8cf76-calico-apiserver-certs\") pod \"1378257a-134f-443c-9608-a7936ba8cf76\" (UID: \"1378257a-134f-443c-9608-a7936ba8cf76\") " May 16 16:43:11.253957 kubelet[2904]: I0516 16:43:11.253922 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1378257a-134f-443c-9608-a7936ba8cf76-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "1378257a-134f-443c-9608-a7936ba8cf76" (UID: "1378257a-134f-443c-9608-a7936ba8cf76"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 16 16:43:11.254761 kubelet[2904]: I0516 16:43:11.254742 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1378257a-134f-443c-9608-a7936ba8cf76-kube-api-access-jhtvc" (OuterVolumeSpecName: "kube-api-access-jhtvc") pod "1378257a-134f-443c-9608-a7936ba8cf76" (UID: "1378257a-134f-443c-9608-a7936ba8cf76"). InnerVolumeSpecName "kube-api-access-jhtvc". PluginName "kubernetes.io/projected", VolumeGidValue "" May 16 16:43:11.385304 kubelet[2904]: I0516 16:43:11.385218 2904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhtvc\" (UniqueName: \"kubernetes.io/projected/1378257a-134f-443c-9608-a7936ba8cf76-kube-api-access-jhtvc\") on node \"localhost\" DevicePath \"\"" May 16 16:43:11.385304 kubelet[2904]: I0516 16:43:11.385245 2904 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1378257a-134f-443c-9608-a7936ba8cf76-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" May 16 16:43:11.434040 systemd[1]: Removed slice kubepods-besteffort-pod1378257a_134f_443c_9608_a7936ba8cf76.slice - libcontainer container kubepods-besteffort-pod1378257a_134f_443c_9608_a7936ba8cf76.slice. May 16 16:43:11.593854 kubelet[2904]: I0516 16:43:11.593499 2904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1378257a-134f-443c-9608-a7936ba8cf76" path="/var/lib/kubelet/pods/1378257a-134f-443c-9608-a7936ba8cf76/volumes" May 16 16:43:11.880024 containerd[1610]: time="2025-05-16T16:43:11.879991774Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" id:\"dd65d5cc96548cc9cabdbd3a5f60628b0584417b0389437bda12c6575eccb27c\" pid:5560 exited_at:{seconds:1747413791 nanos:879658310}" May 16 16:43:11.945722 systemd[1]: var-lib-kubelet-pods-1378257a\x2d134f\x2d443c\x2d9608\x2da7936ba8cf76-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djhtvc.mount: Deactivated successfully. 
May 16 16:43:11.947116 systemd[1]: var-lib-kubelet-pods-1378257a\x2d134f\x2d443c\x2d9608\x2da7936ba8cf76-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 16 16:43:12.204839 systemd-networkd[1541]: cali469ded6bb9a: Gained IPv6LL May 16 16:43:12.278316 kubelet[2904]: I0516 16:43:12.245354 2904 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-567dc9f47-mj2sv" podStartSLOduration=3.242689379 podStartE2EDuration="3.242689379s" podCreationTimestamp="2025-05-16 16:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:43:12.194891634 +0000 UTC m=+64.745143169" watchObservedRunningTime="2025-05-16 16:43:12.242689379 +0000 UTC m=+64.792940916" May 16 16:43:12.340637 containerd[1610]: time="2025-05-16T16:43:12.340613721Z" level=info msg="StopContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" with timeout 30 (s)" May 16 16:43:12.343201 containerd[1610]: time="2025-05-16T16:43:12.343088758Z" level=info msg="Stop container \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" with signal terminated" May 16 16:43:12.376269 systemd[1]: cri-containerd-a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab.scope: Deactivated successfully. May 16 16:43:12.378959 containerd[1610]: time="2025-05-16T16:43:12.378928412Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" id:\"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" pid:4938 exit_status:1 exited_at:{seconds:1747413792 nanos:378556184}" May 16 16:43:12.390838 containerd[1610]: time="2025-05-16T16:43:12.390798644Z" level=info msg="received exit event container_id:\"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" id:\"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" pid:4938 exit_status:1 exited_at:{seconds:1747413792 nanos:378556184}" May 16 16:43:12.407631 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab-rootfs.mount: Deactivated successfully. May 16 16:43:12.412631 containerd[1610]: time="2025-05-16T16:43:12.412569863Z" level=info msg="StopContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" returns successfully" May 16 16:43:12.413044 containerd[1610]: time="2025-05-16T16:43:12.412977061Z" level=info msg="StopPodSandbox for \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\"" May 16 16:43:12.417399 containerd[1610]: time="2025-05-16T16:43:12.417386652Z" level=info msg="Container to stop \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 16 16:43:12.421296 systemd[1]: cri-containerd-005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac.scope: Deactivated successfully. 
May 16 16:43:12.427308 containerd[1610]: time="2025-05-16T16:43:12.427251370Z" level=info msg="TaskExit event in podsandbox handler container_id:\"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" id:\"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" pid:4379 exit_status:137 exited_at:{seconds:1747413792 nanos:427070806}" May 16 16:43:12.443886 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac-rootfs.mount: Deactivated successfully. May 16 16:43:12.446298 containerd[1610]: time="2025-05-16T16:43:12.446276588Z" level=info msg="shim disconnected" id=005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac namespace=k8s.io May 16 16:43:12.446298 containerd[1610]: time="2025-05-16T16:43:12.446295938Z" level=warning msg="cleaning up after shim disconnected" id=005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac namespace=k8s.io May 16 16:43:12.455955 containerd[1610]: time="2025-05-16T16:43:12.446300608Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 16 16:43:12.472836 containerd[1610]: time="2025-05-16T16:43:12.472757845Z" level=info msg="received exit event sandbox_id:\"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" exit_status:137 exited_at:{seconds:1747413792 nanos:427070806}" May 16 16:43:12.474154 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac-shm.mount: Deactivated successfully. May 16 16:43:12.519884 systemd-networkd[1541]: cali96b73110625: Link DOWN May 16 16:43:12.519982 systemd-networkd[1541]: cali96b73110625: Lost carrier May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.518 [INFO][5649] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.518 [INFO][5649] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" iface="eth0" netns="/var/run/netns/cni-cd719121-dc09-6de5-35e2-2eba0f576fd0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.519 [INFO][5649] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" iface="eth0" netns="/var/run/netns/cni-cd719121-dc09-6de5-35e2-2eba0f576fd0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.525 [INFO][5649] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" after=6.851327ms iface="eth0" netns="/var/run/netns/cni-cd719121-dc09-6de5-35e2-2eba0f576fd0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.525 [INFO][5649] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.525 [INFO][5649] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.546 [INFO][5658] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.547 [INFO][5658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.547 [INFO][5658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.571 [INFO][5658] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.571 [INFO][5658] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.572 [INFO][5658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:43:12.577167 containerd[1610]: 2025-05-16 16:43:12.574 [INFO][5649] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:43:12.579959 containerd[1610]: time="2025-05-16T16:43:12.579912327Z" level=info msg="TearDown network for sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" successfully" May 16 16:43:12.579959 containerd[1610]: time="2025-05-16T16:43:12.579930419Z" level=info msg="StopPodSandbox for \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" returns successfully" May 16 16:43:12.579554 systemd[1]: run-netns-cni\x2dcd719121\x2ddc09\x2d6de5\x2d35e2\x2d2eba0f576fd0.mount: Deactivated successfully. 
May 16 16:43:12.622153 kubelet[2904]: E0516 16:43:12.622119 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:43:12.759269 kubelet[2904]: I0516 16:43:12.759199 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03784e1b-ebb0-4a4e-8922-7c6e649932aa-calico-apiserver-certs\") pod \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\" (UID: \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\") " May 16 16:43:12.759269 kubelet[2904]: I0516 16:43:12.759231 2904 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b42cr\" (UniqueName: \"kubernetes.io/projected/03784e1b-ebb0-4a4e-8922-7c6e649932aa-kube-api-access-b42cr\") pod \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\" (UID: \"03784e1b-ebb0-4a4e-8922-7c6e649932aa\") " May 16 16:43:12.767886 kubelet[2904]: I0516 16:43:12.767860 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03784e1b-ebb0-4a4e-8922-7c6e649932aa-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "03784e1b-ebb0-4a4e-8922-7c6e649932aa" (UID: "03784e1b-ebb0-4a4e-8922-7c6e649932aa"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 16 16:43:12.768705 kubelet[2904]: I0516 16:43:12.768688 2904 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03784e1b-ebb0-4a4e-8922-7c6e649932aa-kube-api-access-b42cr" (OuterVolumeSpecName: "kube-api-access-b42cr") pod "03784e1b-ebb0-4a4e-8922-7c6e649932aa" (UID: "03784e1b-ebb0-4a4e-8922-7c6e649932aa"). InnerVolumeSpecName "kube-api-access-b42cr". PluginName "kubernetes.io/projected", VolumeGidValue "" May 16 16:43:12.860226 kubelet[2904]: I0516 16:43:12.860200 2904 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03784e1b-ebb0-4a4e-8922-7c6e649932aa-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\"" May 16 16:43:12.860226 kubelet[2904]: I0516 16:43:12.860222 2904 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b42cr\" (UniqueName: \"kubernetes.io/projected/03784e1b-ebb0-4a4e-8922-7c6e649932aa-kube-api-access-b42cr\") on node \"localhost\" DevicePath \"\"" May 16 16:43:12.943248 systemd[1]: var-lib-kubelet-pods-03784e1b\x2debb0\x2d4a4e\x2d8922\x2d7c6e649932aa-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2db42cr.mount: Deactivated successfully. May 16 16:43:12.943353 systemd[1]: var-lib-kubelet-pods-03784e1b\x2debb0\x2d4a4e\x2d8922\x2d7c6e649932aa-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
May 16 16:43:13.234037 kubelet[2904]: I0516 16:43:13.233948 2904 scope.go:117] "RemoveContainer" containerID="a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab" May 16 16:43:13.235478 containerd[1610]: time="2025-05-16T16:43:13.235127036Z" level=info msg="RemoveContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\"" May 16 16:43:13.249680 systemd[1]: Removed slice kubepods-besteffort-pod03784e1b_ebb0_4a4e_8922_7c6e649932aa.slice - libcontainer container kubepods-besteffort-pod03784e1b_ebb0_4a4e_8922_7c6e649932aa.slice. May 16 16:43:13.249762 systemd[1]: kubepods-besteffort-pod03784e1b_ebb0_4a4e_8922_7c6e649932aa.slice: Consumed 1.016s CPU time, 47M memory peak, 12K read from disk. May 16 16:43:13.276637 containerd[1610]: time="2025-05-16T16:43:13.250504664Z" level=info msg="RemoveContainer for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" returns successfully" May 16 16:43:13.276637 containerd[1610]: time="2025-05-16T16:43:13.250846747Z" level=error msg="ContainerStatus for \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\": not found" May 16 16:43:13.276709 kubelet[2904]: I0516 16:43:13.250682 2904 scope.go:117] "RemoveContainer" containerID="a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab" May 16 16:43:13.283779 kubelet[2904]: E0516 16:43:13.276946 2904 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\": not found" containerID="a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab" May 16 16:43:13.373282 kubelet[2904]: I0516 16:43:13.277013 2904 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab"} err="failed to get container status \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\": rpc error: code = NotFound desc = an error occurred when try to find container \"a4d5648efad537e81a4d602cd6fd4483ee1e3a4c4de5a52ebeb210502915dcab\": not found" May 16 16:43:13.579038 kubelet[2904]: I0516 16:43:13.578968 2904 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03784e1b-ebb0-4a4e-8922-7c6e649932aa" path="/var/lib/kubelet/pods/03784e1b-ebb0-4a4e-8922-7c6e649932aa/volumes" May 16 16:43:13.587914 containerd[1610]: time="2025-05-16T16:43:13.587881006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 16:43:13.854216 containerd[1610]: time="2025-05-16T16:43:13.854101579Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:13.859155 containerd[1610]: time="2025-05-16T16:43:13.858678583Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 16:43:13.863851 containerd[1610]: time="2025-05-16T16:43:13.863814384Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve 
reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:13.864138 kubelet[2904]: E0516 16:43:13.864108 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:13.870062 kubelet[2904]: E0516 16:43:13.869958 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:13.956155 kubelet[2904]: E0516 16:43:13.956094 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46x4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-7x2n4_calico-system(f47c5b55-7da4-4cde-9213-a21e48a7736b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:13.957273 kubelet[2904]: E0516 16:43:13.957249 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:43:22.042675 systemd[1]: Started sshd@7-139.178.70.106:22-147.75.109.163:56050.service - OpenSSH per-connection server daemon (147.75.109.163:56050). May 16 16:43:22.193405 sshd[5681]: Accepted publickey for core from 147.75.109.163 port 56050 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:22.197052 sshd-session[5681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:22.206465 systemd-logind[1588]: New session 10 of user core. May 16 16:43:22.210849 systemd[1]: Started session-10.scope - Session 10 of User core. May 16 16:43:22.727428 sshd[5683]: Connection closed by 147.75.109.163 port 56050 May 16 16:43:22.727903 sshd-session[5681]: pam_unix(sshd:session): session closed for user core May 16 16:43:22.732024 systemd-logind[1588]: Session 10 logged out. Waiting for processes to exit. May 16 16:43:22.732089 systemd[1]: sshd@7-139.178.70.106:22-147.75.109.163:56050.service: Deactivated successfully. May 16 16:43:22.733366 systemd[1]: session-10.scope: Deactivated successfully. May 16 16:43:22.734633 systemd-logind[1588]: Removed session 10. 
May 16 16:43:26.975975 containerd[1610]: time="2025-05-16T16:43:26.975882748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:43:27.272345 containerd[1610]: time="2025-05-16T16:43:27.272254304Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:27.274434 containerd[1610]: time="2025-05-16T16:43:27.274395546Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:27.286117 containerd[1610]: time="2025-05-16T16:43:27.274455903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:43:27.290797 kubelet[2904]: E0516 16:43:27.285492 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:27.290797 kubelet[2904]: E0516 16:43:27.290665 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:27.295754 kubelet[2904]: E0516 16:43:27.295689 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c110a762b15a493fa59124ec728c2d93,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:27.297380 containerd[1610]: time="2025-05-16T16:43:27.297351324Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:43:27.534976 containerd[1610]: time="2025-05-16T16:43:27.534889107Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:27.539148 containerd[1610]: time="2025-05-16T16:43:27.539094055Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:27.548120 containerd[1610]: time="2025-05-16T16:43:27.539165315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:43:27.548181 kubelet[2904]: E0516 16:43:27.539428 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:27.548181 kubelet[2904]: E0516 16:43:27.539463 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:27.548181 kubelet[2904]: E0516 16:43:27.539545 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:27.615755 kubelet[2904]: E0516 16:43:27.568320 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:43:27.657904 kubelet[2904]: E0516 16:43:27.657468 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:43:27.740748 systemd[1]: Started sshd@8-139.178.70.106:22-147.75.109.163:56052.service - OpenSSH per-connection server daemon (147.75.109.163:56052). May 16 16:43:28.041362 sshd[5708]: Accepted publickey for core from 147.75.109.163 port 56052 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:28.041270 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:28.047329 systemd-logind[1588]: New session 11 of user core. May 16 16:43:28.052879 systemd[1]: Started session-11.scope - Session 11 of User core. May 16 16:43:28.700038 sshd[5710]: Connection closed by 147.75.109.163 port 56052 May 16 16:43:28.699695 sshd-session[5708]: pam_unix(sshd:session): session closed for user core May 16 16:43:28.702877 systemd[1]: sshd@8-139.178.70.106:22-147.75.109.163:56052.service: Deactivated successfully. May 16 16:43:28.707041 systemd[1]: session-11.scope: Deactivated successfully. May 16 16:43:28.707949 systemd-logind[1588]: Session 11 logged out. Waiting for processes to exit. May 16 16:43:28.709048 systemd-logind[1588]: Removed session 11. May 16 16:43:33.709271 systemd[1]: Started sshd@9-139.178.70.106:22-147.75.109.163:51876.service - OpenSSH per-connection server daemon (147.75.109.163:51876). May 16 16:43:33.817636 sshd[5726]: Accepted publickey for core from 147.75.109.163 port 51876 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:33.819234 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:33.823411 systemd-logind[1588]: New session 12 of user core. May 16 16:43:33.827872 systemd[1]: Started session-12.scope - Session 12 of User core. May 16 16:43:33.926878 sshd[5728]: Connection closed by 147.75.109.163 port 51876 May 16 16:43:33.927298 sshd-session[5726]: pam_unix(sshd:session): session closed for user core May 16 16:43:33.929768 systemd[1]: sshd@9-139.178.70.106:22-147.75.109.163:51876.service: Deactivated successfully. May 16 16:43:33.931246 systemd[1]: session-12.scope: Deactivated successfully. May 16 16:43:33.931983 systemd-logind[1588]: Session 12 logged out. Waiting for processes to exit. May 16 16:43:33.933396 systemd-logind[1588]: Removed session 12. May 16 16:43:38.938712 systemd[1]: Started sshd@10-139.178.70.106:22-147.75.109.163:44086.service - OpenSSH per-connection server daemon (147.75.109.163:44086). 
May 16 16:43:38.995555 sshd[5741]: Accepted publickey for core from 147.75.109.163 port 44086 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:38.997087 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:39.001850 systemd-logind[1588]: New session 13 of user core. May 16 16:43:39.006862 systemd[1]: Started session-13.scope - Session 13 of User core. May 16 16:43:39.135623 sshd[5743]: Connection closed by 147.75.109.163 port 44086 May 16 16:43:39.143075 systemd[1]: sshd@10-139.178.70.106:22-147.75.109.163:44086.service: Deactivated successfully. May 16 16:43:39.137021 sshd-session[5741]: pam_unix(sshd:session): session closed for user core May 16 16:43:39.144441 systemd[1]: session-13.scope: Deactivated successfully. May 16 16:43:39.145406 systemd-logind[1588]: Session 13 logged out. Waiting for processes to exit. May 16 16:43:39.146716 systemd[1]: Started sshd@11-139.178.70.106:22-147.75.109.163:44088.service - OpenSSH per-connection server daemon (147.75.109.163:44088). May 16 16:43:39.148470 systemd-logind[1588]: Removed session 13. May 16 16:43:39.192406 sshd[5756]: Accepted publickey for core from 147.75.109.163 port 44088 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:39.193528 sshd-session[5756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:39.196836 systemd-logind[1588]: New session 14 of user core. May 16 16:43:39.206323 systemd[1]: Started session-14.scope - Session 14 of User core. May 16 16:43:39.487340 sshd[5758]: Connection closed by 147.75.109.163 port 44088 May 16 16:43:39.499471 systemd[1]: sshd@11-139.178.70.106:22-147.75.109.163:44088.service: Deactivated successfully. May 16 16:43:39.487790 sshd-session[5756]: pam_unix(sshd:session): session closed for user core May 16 16:43:39.503217 systemd[1]: session-14.scope: Deactivated successfully. May 16 16:43:39.508311 systemd-logind[1588]: Session 14 logged out. Waiting for processes to exit. May 16 16:43:39.510851 systemd[1]: Started sshd@12-139.178.70.106:22-147.75.109.163:44102.service - OpenSSH per-connection server daemon (147.75.109.163:44102). May 16 16:43:39.512875 systemd-logind[1588]: Removed session 14. May 16 16:43:39.570785 sshd[5773]: Accepted publickey for core from 147.75.109.163 port 44102 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:39.572706 sshd-session[5773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:39.579852 systemd-logind[1588]: New session 15 of user core. May 16 16:43:39.586923 systemd[1]: Started session-15.scope - Session 15 of User core. May 16 16:43:39.655570 kubelet[2904]: E0516 16:43:39.655546 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:43:39.725782 sshd[5776]: Connection closed by 147.75.109.163 port 44102 May 16 16:43:39.726038 sshd-session[5773]: pam_unix(sshd:session): session closed for user core May 16 16:43:39.729851 systemd-logind[1588]: Session 15 logged out. Waiting for processes to exit. 
May 16 16:43:39.729997 systemd[1]: sshd@12-139.178.70.106:22-147.75.109.163:44102.service: Deactivated successfully. May 16 16:43:39.732029 systemd[1]: session-15.scope: Deactivated successfully. May 16 16:43:39.733438 systemd-logind[1588]: Removed session 15. May 16 16:43:39.816867 containerd[1610]: time="2025-05-16T16:43:39.816773462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" id:\"9aadc8d10433b031b2a1cb4687c2c0482166f8f7c0d63f66468188fea5454d70\" pid:5783 exited_at:{seconds:1747413819 nanos:816514689}" May 16 16:43:41.582062 containerd[1610]: time="2025-05-16T16:43:41.581459296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 16:43:41.835789 containerd[1610]: time="2025-05-16T16:43:41.835622704Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:41.839297 containerd[1610]: time="2025-05-16T16:43:41.839253397Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:41.839365 containerd[1610]: time="2025-05-16T16:43:41.839333346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 16:43:41.839610 kubelet[2904]: E0516 16:43:41.839512 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:41.839610 kubelet[2904]: E0516 16:43:41.839561 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:41.846059 kubelet[2904]: E0516 16:43:41.840793 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46x4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-7x2n4_calico-system(f47c5b55-7da4-4cde-9213-a21e48a7736b): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:41.846059 kubelet[2904]: E0516 16:43:41.842135 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:43:42.082758 containerd[1610]: time="2025-05-16T16:43:42.082694897Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" id:\"5b470b66882cccca969a500ec715f5eb536eeba53bafe64420ab13a797b17cb0\" pid:5813 exit_status:1 exited_at:{seconds:1747413822 nanos:81851457}" May 16 16:43:44.746295 systemd[1]: Started sshd@13-139.178.70.106:22-147.75.109.163:44114.service - OpenSSH per-connection server daemon (147.75.109.163:44114). May 16 16:43:45.130253 sshd[5834]: Accepted publickey for core from 147.75.109.163 port 44114 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:45.132279 sshd-session[5834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:45.135966 systemd-logind[1588]: New session 16 of user core. May 16 16:43:45.141872 systemd[1]: Started session-16.scope - Session 16 of User core. May 16 16:43:45.630784 sshd[5836]: Connection closed by 147.75.109.163 port 44114 May 16 16:43:45.631186 sshd-session[5834]: pam_unix(sshd:session): session closed for user core May 16 16:43:45.633166 systemd[1]: sshd@13-139.178.70.106:22-147.75.109.163:44114.service: Deactivated successfully. May 16 16:43:45.634562 systemd[1]: session-16.scope: Deactivated successfully. May 16 16:43:45.635311 systemd-logind[1588]: Session 16 logged out. Waiting for processes to exit. May 16 16:43:45.636329 systemd-logind[1588]: Removed session 16. May 16 16:43:50.580470 kubelet[2904]: E0516 16:43:50.579894 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:43:50.641022 systemd[1]: Started sshd@14-139.178.70.106:22-147.75.109.163:41334.service - OpenSSH per-connection server daemon (147.75.109.163:41334). May 16 16:43:50.699766 sshd[5848]: Accepted publickey for core from 147.75.109.163 port 41334 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:50.701032 sshd-session[5848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:50.706526 systemd-logind[1588]: New session 17 of user core. May 16 16:43:50.713915 systemd[1]: Started session-17.scope - Session 17 of User core. May 16 16:43:50.870016 sshd[5850]: Connection closed by 147.75.109.163 port 41334 May 16 16:43:50.870470 sshd-session[5848]: pam_unix(sshd:session): session closed for user core May 16 16:43:50.873015 systemd[1]: sshd@14-139.178.70.106:22-147.75.109.163:41334.service: Deactivated successfully. May 16 16:43:50.874311 systemd[1]: session-17.scope: Deactivated successfully. May 16 16:43:50.874952 systemd-logind[1588]: Session 17 logged out. Waiting for processes to exit. May 16 16:43:50.876097 systemd-logind[1588]: Removed session 17. May 16 16:43:55.880195 systemd[1]: Started sshd@15-139.178.70.106:22-147.75.109.163:41350.service - OpenSSH per-connection server daemon (147.75.109.163:41350).
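The goldmane pull above fails before any image data moves: containerd first asks ghcr.io for an anonymous bearer token, and that token request itself returns 403 Forbidden. A minimal sketch to reproduce the token request from any machine (the URL is copied verbatim from the log; Python with urllib is an illustrative choice, not what containerd actually uses):

    import urllib.error
    import urllib.request

    # Token endpoint exactly as it appears in the containerd error above.
    TOKEN_URL = ("https://ghcr.io/token"
                 "?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull"
                 "&service=ghcr.io")

    try:
        with urllib.request.urlopen(TOKEN_URL, timeout=10) as resp:
            # A healthy public repository returns 200 with a JSON {"token": ...} body.
            print(resp.status, resp.read(200))
    except urllib.error.HTTPError as exc:
        # A 403 here matches the "failed to fetch anonymous token" failure above.
        print(exc.code, exc.reason)

If this also returns 403 from an unrelated network, the registry is refusing anonymous pulls for that repository (private, removed, or rate-limited), rather than anything being misconfigured on the node.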
May 16 16:43:55.937561 sshd[5864]: Accepted publickey for core from 147.75.109.163 port 41350 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:43:55.938369 sshd-session[5864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:55.940962 systemd-logind[1588]: New session 18 of user core. May 16 16:43:55.944815 systemd[1]: Started session-18.scope - Session 18 of User core. May 16 16:43:56.131841 sshd[5866]: Connection closed by 147.75.109.163 port 41350 May 16 16:43:56.132291 sshd-session[5864]: pam_unix(sshd:session): session closed for user core May 16 16:43:56.134694 systemd[1]: sshd@15-139.178.70.106:22-147.75.109.163:41350.service: Deactivated successfully. May 16 16:43:56.135893 systemd[1]: session-18.scope: Deactivated successfully. May 16 16:43:56.136487 systemd-logind[1588]: Session 18 logged out. Waiting for processes to exit. May 16 16:43:56.137323 systemd-logind[1588]: Removed session 18. May 16 16:43:56.578470 kubelet[2904]: E0516 16:43:56.578438 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:44:01.142260 systemd[1]: Started sshd@16-139.178.70.106:22-147.75.109.163:42952.service - OpenSSH per-connection server daemon (147.75.109.163:42952). May 16 16:44:01.206425 sshd[5878]: Accepted publickey for core from 147.75.109.163 port 42952 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:01.207915 sshd-session[5878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:01.211224 systemd-logind[1588]: New session 19 of user core. May 16 16:44:01.217817 systemd[1]: Started session-19.scope - Session 19 of User core. May 16 16:44:01.397144 sshd[5881]: Connection closed by 147.75.109.163 port 42952 May 16 16:44:01.398238 sshd-session[5878]: pam_unix(sshd:session): session closed for user core May 16 16:44:01.407245 systemd[1]: sshd@16-139.178.70.106:22-147.75.109.163:42952.service: Deactivated successfully. May 16 16:44:01.411081 systemd[1]: session-19.scope: Deactivated successfully. May 16 16:44:01.412032 systemd-logind[1588]: Session 19 logged out. Waiting for processes to exit. May 16 16:44:01.414533 systemd-logind[1588]: Removed session 19. May 16 16:44:01.417641 systemd[1]: Started sshd@17-139.178.70.106:22-147.75.109.163:42958.service - OpenSSH per-connection server daemon (147.75.109.163:42958). May 16 16:44:01.457316 sshd[5893]: Accepted publickey for core from 147.75.109.163 port 42958 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:01.459392 sshd-session[5893]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:01.466005 systemd-logind[1588]: New session 20 of user core. May 16 16:44:01.471852 systemd[1]: Started session-20.scope - Session 20 of User core. May 16 16:44:01.824838 sshd[5895]: Connection closed by 147.75.109.163 port 42958 May 16 16:44:01.827339 sshd-session[5893]: pam_unix(sshd:session): session closed for user core May 16 16:44:01.831309 systemd[1]: sshd@17-139.178.70.106:22-147.75.109.163:42958.service: Deactivated successfully. May 16 16:44:01.833034 systemd[1]: session-20.scope: Deactivated successfully. May 16 16:44:01.834642 systemd-logind[1588]: Session 20 logged out. Waiting for processes to exit. 
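By this point the goldmane pull has moved from ErrImagePull to ImagePullBackOff: the kubelet stops retrying immediately and waits between attempts. Kubernetes documents this delay as starting around 10 seconds and doubling per attempt up to a five-minute cap; an illustrative sketch of that schedule (not kubelet source, function name is mine):

    def image_pull_backoff_delays(attempts, initial=10.0, factor=2.0, cap=300.0):
        """Yield the documented 10s, 20s, 40s, ... delay sequence, capped at 300s."""
        delay = initial
        for _ in range(attempts):
            yield min(delay, cap)
            delay *= factor

    print(list(image_pull_backoff_delays(7)))
    # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]

This is why the "Back-off pulling image" entries recur at widening intervals through the rest of the log.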
May 16 16:44:01.837967 systemd[1]: Started sshd@18-139.178.70.106:22-147.75.109.163:42960.service - OpenSSH per-connection server daemon (147.75.109.163:42960). May 16 16:44:01.848266 systemd-logind[1588]: Removed session 20. May 16 16:44:01.916930 sshd[5905]: Accepted publickey for core from 147.75.109.163 port 42960 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:01.917751 sshd-session[5905]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:01.922762 systemd-logind[1588]: New session 21 of user core. May 16 16:44:01.928867 systemd[1]: Started session-21.scope - Session 21 of User core. May 16 16:44:04.469287 sshd[5907]: Connection closed by 147.75.109.163 port 42960 May 16 16:44:04.487227 sshd-session[5905]: pam_unix(sshd:session): session closed for user core May 16 16:44:04.558740 systemd[1]: Started sshd@19-139.178.70.106:22-147.75.109.163:42976.service - OpenSSH per-connection server daemon (147.75.109.163:42976). May 16 16:44:04.559045 systemd[1]: sshd@18-139.178.70.106:22-147.75.109.163:42960.service: Deactivated successfully. May 16 16:44:04.561442 systemd[1]: session-21.scope: Deactivated successfully. May 16 16:44:04.561564 systemd[1]: session-21.scope: Consumed 376ms CPU time, 73.5M memory peak. May 16 16:44:04.562436 systemd-logind[1588]: Session 21 logged out. Waiting for processes to exit. May 16 16:44:04.564543 systemd-logind[1588]: Removed session 21. May 16 16:44:04.764038 sshd[5926]: Accepted publickey for core from 147.75.109.163 port 42976 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:04.765692 sshd-session[5926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:04.782093 systemd-logind[1588]: New session 22 of user core. May 16 16:44:04.785833 systemd[1]: Started session-22.scope - Session 22 of User core. May 16 16:44:05.708023 kubelet[2904]: E0516 16:44:05.681216 2904 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.875s" May 16 16:44:06.091248 kubelet[2904]: E0516 16:44:06.091155 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:44:07.206482 containerd[1610]: time="2025-05-16T16:44:07.206434991Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" id:\"bd32b9edfc4b16f8c71e32a070c969fcda837e7a15b5b739ea4d7914f0c9434f\" pid:5954 exited_at:{seconds:1747413847 nanos:183511861}" May 16 16:44:07.437948 sshd[5935]: Connection closed by 147.75.109.163 port 42976 May 16 16:44:07.447929 sshd-session[5926]: pam_unix(sshd:session): session closed for user core May 16 16:44:07.623275 systemd[1]: sshd@19-139.178.70.106:22-147.75.109.163:42976.service: Deactivated successfully. May 16 16:44:07.624365 systemd[1]: session-22.scope: Deactivated successfully. May 16 16:44:07.624471 systemd[1]: session-22.scope: Consumed 648ms CPU time, 69M memory peak. May 16 16:44:07.624833 systemd-logind[1588]: Session 22 logged out. Waiting for processes to exit. 
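The "Consumed 376ms CPU time, 73.5M memory peak" figures systemd prints when a session scope stops come from per-unit cgroup accounting. Assuming the usual cgroup v2 layout for logind sessions (the paths below are typical, not taken from this log), the same counters can be read from the filesystem while a session is still alive:

    from pathlib import Path

    # Typical location of a logind session scope for uid 500 under cgroup v2.
    scope = Path("/sys/fs/cgroup/user.slice/user-500.slice/session-21.scope")

    print((scope / "cpu.stat").read_text())     # usage_usec and related counters
    print((scope / "memory.peak").read_text())  # memory high-water mark, in bytes

Note that memory.peak only exists on newer kernels; on older ones only the cpu.stat half of this is available.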
May 16 16:44:07.626178 systemd[1]: Started sshd@20-139.178.70.106:22-147.75.109.163:42986.service - OpenSSH per-connection server daemon (147.75.109.163:42986). May 16 16:44:07.626659 systemd-logind[1588]: Removed session 22. May 16 16:44:08.333749 sshd[5968]: Accepted publickey for core from 147.75.109.163 port 42986 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:08.336335 sshd-session[5968]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:08.344510 systemd-logind[1588]: New session 23 of user core. May 16 16:44:08.349040 systemd[1]: Started session-23.scope - Session 23 of User core. May 16 16:44:08.577407 kubelet[2904]: E0516 16:44:08.576227 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:44:08.657817 containerd[1610]: time="2025-05-16T16:44:08.657556563Z" level=info msg="StopPodSandbox for \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\"" May 16 16:44:09.210208 sshd[5971]: Connection closed by 147.75.109.163 port 42986 May 16 16:44:09.219301 sshd-session[5968]: pam_unix(sshd:session): session closed for user core May 16 16:44:09.231989 systemd[1]: sshd@20-139.178.70.106:22-147.75.109.163:42986.service: Deactivated successfully. May 16 16:44:09.234152 systemd[1]: session-23.scope: Deactivated successfully. May 16 16:44:09.235424 systemd-logind[1588]: Session 23 logged out. Waiting for processes to exit. May 16 16:44:09.238219 systemd-logind[1588]: Removed session 23. May 16 16:44:09.476921 containerd[1610]: time="2025-05-16T16:44:09.476825722Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c308bfe5138a8939fc708973c56bf18c4226af81557c735072eead0f81c23362\" id:\"1a153b443dfc9b5b100d392f504f0cefa6888eed11b171bfc1e2fbd8bd80edd0\" pid:6008 exited_at:{seconds:1747413849 nanos:424250343}" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:09.760 [WARNING][5988] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:09.768 [INFO][5988] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:09.768 [INFO][5988] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" iface="eth0" netns="" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:09.768 [INFO][5988] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:09.768 [INFO][5988] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.278 [INFO][6020] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.280 [INFO][6020] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.282 [INFO][6020] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.304 [WARNING][6020] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.305 [INFO][6020] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.306 [INFO][6020] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:44:10.332293 containerd[1610]: 2025-05-16 16:44:10.308 [INFO][5988] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:10.370579 containerd[1610]: time="2025-05-16T16:44:10.346049306Z" level=info msg="TearDown network for sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" successfully" May 16 16:44:10.370579 containerd[1610]: time="2025-05-16T16:44:10.346088128Z" level=info msg="StopPodSandbox for \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" returns successfully" May 16 16:44:10.507451 update_engine[1592]: I20250516 16:44:10.507380 1592 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 16 16:44:10.507451 update_engine[1592]: I20250516 16:44:10.507447 1592 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 16 16:44:10.516457 update_engine[1592]: I20250516 16:44:10.516422 1592 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 16 16:44:10.518095 update_engine[1592]: I20250516 16:44:10.518075 1592 omaha_request_params.cc:62] Current group set to developer May 16 16:44:10.528953 update_engine[1592]: I20250516 16:44:10.528280 1592 update_attempter.cc:499] Already updated boot flags. Skipping. 
May 16 16:44:10.528953 update_engine[1592]: I20250516 16:44:10.528304 1592 update_attempter.cc:643] Scheduling an action processor start. May 16 16:44:10.528953 update_engine[1592]: I20250516 16:44:10.528326 1592 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 16 16:44:10.528953 update_engine[1592]: I20250516 16:44:10.528367 1592 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 16 16:44:10.532415 update_engine[1592]: I20250516 16:44:10.532362 1592 omaha_request_action.cc:271] Posting an Omaha request to disabled May 16 16:44:10.535580 update_engine[1592]: I20250516 16:44:10.532460 1592 omaha_request_action.cc:272] Request: May 16 16:44:10.535580 update_engine[1592]: I20250516 16:44:10.532470 1592 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 16:44:10.556766 locksmithd[1625]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 16 16:44:10.574415 update_engine[1592]: I20250516 16:44:10.574272 1592 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 16:44:10.578661 update_engine[1592]: I20250516 16:44:10.575126 1592 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 16 16:44:10.647622 update_engine[1592]: E20250516 16:44:10.647531 1592 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 16:44:10.647622 update_engine[1592]: I20250516 16:44:10.647604 1592 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 16 16:44:10.717385 containerd[1610]: time="2025-05-16T16:44:10.717345927Z" level=info msg="RemovePodSandbox for \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\"" May 16 16:44:10.717385 containerd[1610]: time="2025-05-16T16:44:10.717392287Z" level=info msg="Forcibly stopping sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\"" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.017 [WARNING][6042] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.024 [INFO][6042] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.024 [INFO][6042] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring.
ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" iface="eth0" netns="" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.024 [INFO][6042] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.024 [INFO][6042] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.586 [INFO][6050] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.589 [INFO][6050] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.589 [INFO][6050] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.613 [WARNING][6050] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.613 [INFO][6050] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" HandleID="k8s-pod-network.005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--mxd6d-eth0" May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.615 [INFO][6050] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:44:11.624090 containerd[1610]: 2025-05-16 16:44:11.618 [INFO][6042] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac" May 16 16:44:11.627550 containerd[1610]: time="2025-05-16T16:44:11.624130505Z" level=info msg="TearDown network for sandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" successfully" May 16 16:44:11.708827 containerd[1610]: time="2025-05-16T16:44:11.708712160Z" level=info msg="Ensure that sandbox 005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac in task-service has been cleanup successfully" May 16 16:44:11.795791 containerd[1610]: time="2025-05-16T16:44:11.795761411Z" level=info msg="RemovePodSandbox \"005f11fac073bd14f13b0d52299e657ae589fed72e96fa96a9a47edf021efcac\" returns successfully" May 16 16:44:11.859645 containerd[1610]: time="2025-05-16T16:44:11.859614367Z" level=info msg="StopPodSandbox for \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\"" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:11.954 [WARNING][6085] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:11.954 [INFO][6085] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:11.955 [INFO][6085] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" iface="eth0" netns="" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:11.955 [INFO][6085] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:11.955 [INFO][6085] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.007 [INFO][6092] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.007 [INFO][6092] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.007 [INFO][6092] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.012 [WARNING][6092] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.012 [INFO][6092] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.013 [INFO][6092] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:44:12.017613 containerd[1610]: 2025-05-16 16:44:12.016 [INFO][6085] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.030610 containerd[1610]: time="2025-05-16T16:44:12.017643186Z" level=info msg="TearDown network for sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" successfully" May 16 16:44:12.030610 containerd[1610]: time="2025-05-16T16:44:12.017659298Z" level=info msg="StopPodSandbox for \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" returns successfully" May 16 16:44:12.067278 containerd[1610]: time="2025-05-16T16:44:12.067025797Z" level=info msg="RemovePodSandbox for \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\"" May 16 16:44:12.067278 containerd[1610]: time="2025-05-16T16:44:12.067053080Z" level=info msg="Forcibly stopping sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\"" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.118 [WARNING][6106] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" WorkloadEndpoint="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.118 [INFO][6106] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.118 [INFO][6106] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" iface="eth0" netns="" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.118 [INFO][6106] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.118 [INFO][6106] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.140 [INFO][6113] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.140 [INFO][6113] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.140 [INFO][6113] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.147 [WARNING][6113] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.147 [INFO][6113] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" HandleID="k8s-pod-network.ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" Workload="localhost-k8s-calico--apiserver--bd5dbfdd--7fnf8-eth0" May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.148 [INFO][6113] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:44:12.152509 containerd[1610]: 2025-05-16 16:44:12.150 [INFO][6106] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4" May 16 16:44:12.173481 containerd[1610]: time="2025-05-16T16:44:12.152531134Z" level=info msg="TearDown network for sandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" successfully" May 16 16:44:12.173481 containerd[1610]: time="2025-05-16T16:44:12.162433485Z" level=info msg="Ensure that sandbox ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4 in task-service has been cleanup successfully" May 16 16:44:12.173481 containerd[1610]: time="2025-05-16T16:44:12.170918266Z" level=info msg="RemovePodSandbox \"ef018003574be2a522e07a0fc854d3f7d8e0d5cce27f8efe329efae0c29138f4\" returns successfully" May 16 16:44:12.792519 containerd[1610]: time="2025-05-16T16:44:12.792484740Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86a2ac6a0f6dae37b2128ada43a52e8091f781d861c1fef24da93d4a8ab39853\" id:\"98d0db3212a079bc48c1b8ea6e5fa9aecfc469af9c4f6de787b23c78358c39de\" pid:6072 exited_at:{seconds:1747413852 nanos:763479994}" May 16 16:44:14.276133 systemd[1]: Started sshd@21-139.178.70.106:22-147.75.109.163:37740.service - OpenSSH per-connection server daemon (147.75.109.163:37740). May 16 16:44:14.434459 sshd[6128]: Accepted publickey for core from 147.75.109.163 port 37740 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:14.436382 sshd-session[6128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:14.447569 systemd-logind[1588]: New session 24 of user core. May 16 16:44:14.452813 systemd[1]: Started session-24.scope - Session 24 of User core. May 16 16:44:15.184629 sshd[6130]: Connection closed by 147.75.109.163 port 37740 May 16 16:44:15.185049 sshd-session[6128]: pam_unix(sshd:session): session closed for user core May 16 16:44:15.191985 systemd[1]: sshd@21-139.178.70.106:22-147.75.109.163:37740.service: Deactivated successfully. May 16 16:44:15.193187 systemd[1]: session-24.scope: Deactivated successfully. May 16 16:44:15.199822 systemd-logind[1588]: Session 24 logged out. Waiting for processes to exit. May 16 16:44:15.206105 systemd-logind[1588]: Removed session 24. 
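Worth noting in the two sandbox teardowns above: every step is idempotent. The CNI plugin proceeds even though the WorkloadEndpoint is already gone from the datastore, and the IPAM plugin logs "Asked to release address but it doesn't exist. Ignoring" instead of failing, so a forced RemovePodSandbox can always run to completion. A small illustrative sketch of that release pattern (my own pseudocode of the pattern, not Calico source):

    def release_address(handle_id: str, allocations: dict) -> None:
        """Idempotent IPAM release: a missing allocation is ignored, not an error."""
        if handle_id not in allocations:
            print(f"WARNING: asked to release {handle_id} but it doesn't exist, ignoring")
            return
        address = allocations.pop(handle_id)
        print(f"released {address} for {handle_id}")

    allocations = {"handle-a": "10.0.0.5"}
    release_address("handle-a", allocations)  # releases 10.0.0.5
    release_address("handle-a", allocations)  # second call is a logged no-op

Making teardown safe to repeat is what lets the "Forcibly stopping sandbox" path succeed even after a partial earlier cleanup.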
May 16 16:44:18.047699 containerd[1610]: time="2025-05-16T16:44:18.047519808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:44:18.437221 containerd[1610]: time="2025-05-16T16:44:18.437180331Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:44:18.438022 containerd[1610]: time="2025-05-16T16:44:18.438004987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:44:18.439363 containerd[1610]: time="2025-05-16T16:44:18.439335433Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:44:18.453320 kubelet[2904]: E0516 16:44:18.451947 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:44:18.461787 kubelet[2904]: E0516 16:44:18.457284 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:44:18.482428 kubelet[2904]: E0516 16:44:18.482394 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:c110a762b15a493fa59124ec728c2d93,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:44:18.490815 containerd[1610]: time="2025-05-16T16:44:18.490789861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:44:18.730886 containerd[1610]: time="2025-05-16T16:44:18.730803269Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:44:18.731192 containerd[1610]: time="2025-05-16T16:44:18.731153483Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:44:18.731270 containerd[1610]: time="2025-05-16T16:44:18.731228678Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:44:18.731533 kubelet[2904]: E0516 16:44:18.731473 2904 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:44:18.731778 kubelet[2904]: E0516 16:44:18.731624 2904 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:44:18.731778 kubelet[2904]: E0516 16:44:18.731717 2904 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82ddw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-54f476585f-nqgnj_calico-system(3f14b109-fa73-4313-9c07-3d3c314a58ba): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:44:18.738655 kubelet[2904]: E0516 16:44:18.738603 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-54f476585f-nqgnj" podUID="3f14b109-fa73-4313-9c07-3d3c314a58ba" May 16 16:44:20.248052 systemd[1]: Started sshd@22-139.178.70.106:22-147.75.109.163:45554.service - OpenSSH per-connection server daemon (147.75.109.163:45554). May 16 16:44:20.392765 sshd[6143]: Accepted publickey for core from 147.75.109.163 port 45554 ssh2: RSA SHA256:ybAKhkONUtPx2kz0EW8NbEw3lO01G9uOeF6UALOI7Jc May 16 16:44:20.395416 sshd-session[6143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:44:20.400751 systemd-logind[1588]: New session 25 of user core. May 16 16:44:20.406848 systemd[1]: Started session-25.scope - Session 25 of User core. May 16 16:44:20.581285 kubelet[2904]: E0516 16:44:20.581218 2904 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-7x2n4" podUID="f47c5b55-7da4-4cde-9213-a21e48a7736b" May 16 16:44:21.170998 sshd-session[6143]: pam_unix(sshd:session): session closed for user core May 16 16:44:21.171367 sshd[6145]: Connection closed by 147.75.109.163 port 45554 May 16 16:44:21.173064 systemd[1]: sshd@22-139.178.70.106:22-147.75.109.163:45554.service: Deactivated successfully. May 16 16:44:21.175193 systemd[1]: session-25.scope: Deactivated successfully. May 16 16:44:21.177762 systemd-logind[1588]: Session 25 logged out. Waiting for processes to exit. May 16 16:44:21.179338 systemd-logind[1588]: Removed session 25. May 16 16:44:21.397989 update_engine[1592]: I20250516 16:44:21.391673 1592 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 16 16:44:21.424798 update_engine[1592]: I20250516 16:44:21.424489 1592 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 16 16:44:21.424798 update_engine[1592]: I20250516 16:44:21.424696 1592 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 16 16:44:21.430771 update_engine[1592]: E20250516 16:44:21.430750 1592 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 16 16:44:21.430869 update_engine[1592]: I20250516 16:44:21.430855 1592 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
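The recurring update_engine failures are expected behavior rather than a malfunction: "Posting an Omaha request to disabled" shows the configured update server URL is the literal string "disabled", so curl's DNS lookup for a host named "disabled" can never succeed. On Flatcar this is the conventional way to switch off update checks, set in /etc/flatcar/update.conf. Given the "Current group set to developer" line earlier, that file plausibly looks like the following (an assumption about this host's configuration, not content captured in the log):

    # /etc/flatcar/update.conf (assumed)
    GROUP=developer
    SERVER=disabled

With that setting, update_engine keeps scheduling checks, failing name resolution, and retrying, exactly as the "No HTTP response, retry N" lines show.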