May 27 03:51:14.188222 kernel: Booting Linux on physical CPU 0x0000120000 [0x413fd0c1] May 27 03:51:14.188245 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 01:20:04 -00 2025 May 27 03:51:14.188253 kernel: KASLR enabled May 27 03:51:14.188258 kernel: efi: EFI v2.7 by American Megatrends May 27 03:51:14.188264 kernel: efi: ACPI 2.0=0xec080000 SMBIOS 3.0=0xf0a1ff98 ESRT=0xea47e818 RNG=0xebf10018 MEMRESERVE=0xe4660f98 May 27 03:51:14.188269 kernel: random: crng init done May 27 03:51:14.188276 kernel: secureboot: Secure boot disabled May 27 03:51:14.188282 kernel: esrt: Reserving ESRT space from 0x00000000ea47e818 to 0x00000000ea47e878. May 27 03:51:14.188289 kernel: ACPI: Early table checksum verification disabled May 27 03:51:14.188295 kernel: ACPI: RSDP 0x00000000EC080000 000024 (v02 Ampere) May 27 03:51:14.188301 kernel: ACPI: XSDT 0x00000000EC070000 0000A4 (v01 Ampere Altra 00000000 AMI 01000013) May 27 03:51:14.188306 kernel: ACPI: FACP 0x00000000EC050000 000114 (v06 Ampere Altra 00000000 INTL 20190509) May 27 03:51:14.188312 kernel: ACPI: DSDT 0x00000000EBFF0000 019B57 (v02 Ampere Jade 00000001 INTL 20200717) May 27 03:51:14.188318 kernel: ACPI: DBG2 0x00000000EC060000 00005C (v00 Ampere Altra 00000000 INTL 20190509) May 27 03:51:14.188326 kernel: ACPI: GTDT 0x00000000EC040000 000110 (v03 Ampere Altra 00000000 INTL 20190509) May 27 03:51:14.188333 kernel: ACPI: SSDT 0x00000000EC030000 00002D (v02 Ampere Altra 00000001 INTL 20190509) May 27 03:51:14.188339 kernel: ACPI: FIDT 0x00000000EBFE0000 00009C (v01 ALASKA A M I 01072009 AMI 00010013) May 27 03:51:14.188345 kernel: ACPI: SPCR 0x00000000EBFD0000 000050 (v02 ALASKA A M I 01072009 AMI 0005000F) May 27 03:51:14.188351 kernel: ACPI: BGRT 0x00000000EBFC0000 000038 (v01 ALASKA A M I 01072009 AMI 00010013) May 27 03:51:14.188357 kernel: ACPI: MCFG 0x00000000EBFB0000 0000AC (v01 Ampere Altra 00000001 AMP. 01000013) May 27 03:51:14.188363 kernel: ACPI: IORT 0x00000000EBFA0000 000610 (v00 Ampere Altra 00000000 AMP. 01000013) May 27 03:51:14.188369 kernel: ACPI: PPTT 0x00000000EBF80000 006E60 (v02 Ampere Altra 00000000 AMP. 01000013) May 27 03:51:14.188375 kernel: ACPI: SLIT 0x00000000EBF70000 00002D (v01 Ampere Altra 00000000 AMP. 01000013) May 27 03:51:14.188381 kernel: ACPI: SRAT 0x00000000EBF60000 0006D0 (v03 Ampere Altra 00000000 AMP. 01000013) May 27 03:51:14.188388 kernel: ACPI: APIC 0x00000000EBF90000 0019F4 (v05 Ampere Altra 00000003 AMI 01000013) May 27 03:51:14.188394 kernel: ACPI: PCCT 0x00000000EBF40000 000576 (v02 Ampere Altra 00000003 AMP. 
01000013) May 27 03:51:14.188400 kernel: ACPI: WSMT 0x00000000EBF30000 000028 (v01 ALASKA A M I 01072009 AMI 00010013) May 27 03:51:14.188406 kernel: ACPI: FPDT 0x00000000EBF20000 000044 (v01 ALASKA A M I 01072009 AMI 01000013) May 27 03:51:14.188412 kernel: ACPI: SPCR: console: pl011,mmio32,0x100002600000,115200 May 27 03:51:14.188418 kernel: ACPI: Use ACPI SPCR as default console: Yes May 27 03:51:14.188424 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x88300000-0x883fffff] May 27 03:51:14.188430 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x90000000-0xffffffff] May 27 03:51:14.188436 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0x8007fffffff] May 27 03:51:14.188442 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80100000000-0x83fffffffff] May 27 03:51:14.188448 kernel: NUMA: Initialized distance table, cnt=1 May 27 03:51:14.188456 kernel: NUMA: Node 0 [mem 0x88300000-0x883fffff] + [mem 0x90000000-0xffffffff] -> [mem 0x88300000-0xffffffff] May 27 03:51:14.188462 kernel: NUMA: Node 0 [mem 0x88300000-0xffffffff] + [mem 0x80000000000-0x8007fffffff] -> [mem 0x88300000-0x8007fffffff] May 27 03:51:14.188468 kernel: NUMA: Node 0 [mem 0x88300000-0x8007fffffff] + [mem 0x80100000000-0x83fffffffff] -> [mem 0x88300000-0x83fffffffff] May 27 03:51:14.188475 kernel: NODE_DATA(0) allocated [mem 0x83fdffd8dc0-0x83fdffdffff] May 27 03:51:14.188481 kernel: Zone ranges: May 27 03:51:14.188490 kernel: DMA [mem 0x0000000088300000-0x00000000ffffffff] May 27 03:51:14.188497 kernel: DMA32 empty May 27 03:51:14.188504 kernel: Normal [mem 0x0000000100000000-0x0000083fffffffff] May 27 03:51:14.188510 kernel: Device empty May 27 03:51:14.188517 kernel: Movable zone start for each node May 27 03:51:14.188523 kernel: Early memory node ranges May 27 03:51:14.188530 kernel: node 0: [mem 0x0000000088300000-0x00000000883fffff] May 27 03:51:14.188536 kernel: node 0: [mem 0x0000000090000000-0x0000000091ffffff] May 27 03:51:14.188542 kernel: node 0: [mem 0x0000000092000000-0x0000000093ffffff] May 27 03:51:14.188548 kernel: node 0: [mem 0x0000000094000000-0x00000000eba37fff] May 27 03:51:14.188555 kernel: node 0: [mem 0x00000000eba38000-0x00000000ebeccfff] May 27 03:51:14.188562 kernel: node 0: [mem 0x00000000ebecd000-0x00000000ebecdfff] May 27 03:51:14.188568 kernel: node 0: [mem 0x00000000ebece000-0x00000000ebecffff] May 27 03:51:14.188575 kernel: node 0: [mem 0x00000000ebed0000-0x00000000ec0effff] May 27 03:51:14.188581 kernel: node 0: [mem 0x00000000ec0f0000-0x00000000ec0fffff] May 27 03:51:14.188587 kernel: node 0: [mem 0x00000000ec100000-0x00000000ee53ffff] May 27 03:51:14.188594 kernel: node 0: [mem 0x00000000ee540000-0x00000000f765ffff] May 27 03:51:14.188600 kernel: node 0: [mem 0x00000000f7660000-0x00000000f784ffff] May 27 03:51:14.188606 kernel: node 0: [mem 0x00000000f7850000-0x00000000f7fdffff] May 27 03:51:14.188613 kernel: node 0: [mem 0x00000000f7fe0000-0x00000000ffc8efff] May 27 03:51:14.188620 kernel: node 0: [mem 0x00000000ffc8f000-0x00000000ffc8ffff] May 27 03:51:14.188627 kernel: node 0: [mem 0x00000000ffc90000-0x00000000ffffffff] May 27 03:51:14.188633 kernel: node 0: [mem 0x0000080000000000-0x000008007fffffff] May 27 03:51:14.188641 kernel: node 0: [mem 0x0000080100000000-0x0000083fffffffff] May 27 03:51:14.188647 kernel: Initmem setup node 0 [mem 0x0000000088300000-0x0000083fffffffff] May 27 03:51:14.188654 kernel: On node 0, zone DMA: 768 pages in unavailable ranges May 27 03:51:14.188660 kernel: On node 0, zone DMA: 31744 pages in unavailable ranges May 27 03:51:14.188666 kernel: psci: probing for conduit method 
from ACPI. May 27 03:51:14.188673 kernel: psci: PSCIv1.1 detected in firmware. May 27 03:51:14.188679 kernel: psci: Using standard PSCI v0.2 function IDs May 27 03:51:14.188685 kernel: psci: MIGRATE_INFO_TYPE not supported. May 27 03:51:14.188692 kernel: psci: SMC Calling Convention v1.2 May 27 03:51:14.188698 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 May 27 03:51:14.188704 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100 -> Node 0 May 27 03:51:14.188711 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x10000 -> Node 0 May 27 03:51:14.188718 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x10100 -> Node 0 May 27 03:51:14.188725 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x20000 -> Node 0 May 27 03:51:14.188731 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x20100 -> Node 0 May 27 03:51:14.188738 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x30000 -> Node 0 May 27 03:51:14.188744 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x30100 -> Node 0 May 27 03:51:14.188751 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x40000 -> Node 0 May 27 03:51:14.188757 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x40100 -> Node 0 May 27 03:51:14.188764 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x50000 -> Node 0 May 27 03:51:14.188770 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x50100 -> Node 0 May 27 03:51:14.188776 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x60000 -> Node 0 May 27 03:51:14.188782 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x60100 -> Node 0 May 27 03:51:14.188790 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x70000 -> Node 0 May 27 03:51:14.188796 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x70100 -> Node 0 May 27 03:51:14.188803 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x80000 -> Node 0 May 27 03:51:14.188809 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x80100 -> Node 0 May 27 03:51:14.188815 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x90000 -> Node 0 May 27 03:51:14.188821 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x90100 -> Node 0 May 27 03:51:14.188828 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xa0000 -> Node 0 May 27 03:51:14.188834 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xa0100 -> Node 0 May 27 03:51:14.188840 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xb0000 -> Node 0 May 27 03:51:14.188847 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xb0100 -> Node 0 May 27 03:51:14.188853 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xc0000 -> Node 0 May 27 03:51:14.188860 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xc0100 -> Node 0 May 27 03:51:14.188867 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xd0000 -> Node 0 May 27 03:51:14.188874 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xd0100 -> Node 0 May 27 03:51:14.188880 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xe0000 -> Node 0 May 27 03:51:14.188887 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xe0100 -> Node 0 May 27 03:51:14.188893 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xf0000 -> Node 0 May 27 03:51:14.188899 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xf0100 -> Node 0 May 27 03:51:14.188906 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100000 -> Node 0 May 27 03:51:14.188912 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100100 -> Node 0 May 27 03:51:14.188918 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x110000 -> Node 0 May 27 03:51:14.188925 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x110100 -> Node 0 May 27 03:51:14.188931 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x120000 -> Node 0 May 27 03:51:14.188939 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x120100 -> Node 0 May 27 03:51:14.188945 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x130000 -> Node 0 May 27 03:51:14.188951 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 
0x130100 -> Node 0 May 27 03:51:14.188958 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x140000 -> Node 0 May 27 03:51:14.188964 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x140100 -> Node 0 May 27 03:51:14.188970 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x150000 -> Node 0 May 27 03:51:14.188977 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x150100 -> Node 0 May 27 03:51:14.188983 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x160000 -> Node 0 May 27 03:51:14.188990 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x160100 -> Node 0 May 27 03:51:14.189003 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x170000 -> Node 0 May 27 03:51:14.189010 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x170100 -> Node 0 May 27 03:51:14.189017 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x180000 -> Node 0 May 27 03:51:14.189024 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x180100 -> Node 0 May 27 03:51:14.189031 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x190000 -> Node 0 May 27 03:51:14.189037 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x190100 -> Node 0 May 27 03:51:14.189044 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1a0000 -> Node 0 May 27 03:51:14.189052 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1a0100 -> Node 0 May 27 03:51:14.189059 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1b0000 -> Node 0 May 27 03:51:14.189066 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1b0100 -> Node 0 May 27 03:51:14.189072 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1c0000 -> Node 0 May 27 03:51:14.189079 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1c0100 -> Node 0 May 27 03:51:14.189086 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1d0000 -> Node 0 May 27 03:51:14.189093 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1d0100 -> Node 0 May 27 03:51:14.189100 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1e0000 -> Node 0 May 27 03:51:14.189107 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1e0100 -> Node 0 May 27 03:51:14.189114 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1f0000 -> Node 0 May 27 03:51:14.189121 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1f0100 -> Node 0 May 27 03:51:14.189127 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x200000 -> Node 0 May 27 03:51:14.189135 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x200100 -> Node 0 May 27 03:51:14.189142 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x210000 -> Node 0 May 27 03:51:14.189149 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x210100 -> Node 0 May 27 03:51:14.189155 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x220000 -> Node 0 May 27 03:51:14.189162 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x220100 -> Node 0 May 27 03:51:14.189169 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x230000 -> Node 0 May 27 03:51:14.189176 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x230100 -> Node 0 May 27 03:51:14.189182 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x240000 -> Node 0 May 27 03:51:14.189189 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x240100 -> Node 0 May 27 03:51:14.189196 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x250000 -> Node 0 May 27 03:51:14.189202 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x250100 -> Node 0 May 27 03:51:14.189213 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x260000 -> Node 0 May 27 03:51:14.189222 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x260100 -> Node 0 May 27 03:51:14.189228 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x270000 -> Node 0 May 27 03:51:14.189235 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x270100 -> Node 0 May 27 03:51:14.189242 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168 May 27 03:51:14.189248 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096 May 27 03:51:14.189255 kernel: 
pcpu-alloc: [0] 00 [0] 01 [0] 02 [0] 03 [0] 04 [0] 05 [0] 06 [0] 07 May 27 03:51:14.189262 kernel: pcpu-alloc: [0] 08 [0] 09 [0] 10 [0] 11 [0] 12 [0] 13 [0] 14 [0] 15 May 27 03:51:14.189269 kernel: pcpu-alloc: [0] 16 [0] 17 [0] 18 [0] 19 [0] 20 [0] 21 [0] 22 [0] 23 May 27 03:51:14.189275 kernel: pcpu-alloc: [0] 24 [0] 25 [0] 26 [0] 27 [0] 28 [0] 29 [0] 30 [0] 31 May 27 03:51:14.189282 kernel: pcpu-alloc: [0] 32 [0] 33 [0] 34 [0] 35 [0] 36 [0] 37 [0] 38 [0] 39 May 27 03:51:14.189289 kernel: pcpu-alloc: [0] 40 [0] 41 [0] 42 [0] 43 [0] 44 [0] 45 [0] 46 [0] 47 May 27 03:51:14.189297 kernel: pcpu-alloc: [0] 48 [0] 49 [0] 50 [0] 51 [0] 52 [0] 53 [0] 54 [0] 55 May 27 03:51:14.189304 kernel: pcpu-alloc: [0] 56 [0] 57 [0] 58 [0] 59 [0] 60 [0] 61 [0] 62 [0] 63 May 27 03:51:14.189311 kernel: pcpu-alloc: [0] 64 [0] 65 [0] 66 [0] 67 [0] 68 [0] 69 [0] 70 [0] 71 May 27 03:51:14.189317 kernel: pcpu-alloc: [0] 72 [0] 73 [0] 74 [0] 75 [0] 76 [0] 77 [0] 78 [0] 79 May 27 03:51:14.189324 kernel: Detected PIPT I-cache on CPU0 May 27 03:51:14.189331 kernel: CPU features: detected: GIC system register CPU interface May 27 03:51:14.189337 kernel: CPU features: detected: Virtualization Host Extensions May 27 03:51:14.189344 kernel: CPU features: detected: Spectre-v4 May 27 03:51:14.189351 kernel: CPU features: detected: Spectre-BHB May 27 03:51:14.189358 kernel: CPU features: kernel page table isolation forced ON by KASLR May 27 03:51:14.189364 kernel: CPU features: detected: Kernel page table isolation (KPTI) May 27 03:51:14.189372 kernel: CPU features: detected: ARM erratum 1418040 May 27 03:51:14.189379 kernel: CPU features: detected: SSBS not fully self-synchronizing May 27 03:51:14.189386 kernel: alternatives: applying boot alternatives May 27 03:51:14.189394 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=packet flatcar.autologin verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a May 27 03:51:14.189401 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 03:51:14.189408 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes May 27 03:51:14.189415 kernel: printk: log_buf_len total cpu_extra contributions: 323584 bytes May 27 03:51:14.189422 kernel: printk: log_buf_len min size: 262144 bytes May 27 03:51:14.189428 kernel: printk: log_buf_len: 1048576 bytes May 27 03:51:14.189435 kernel: printk: early log buf free: 249568(95%) May 27 03:51:14.189442 kernel: Dentry cache hash table entries: 16777216 (order: 15, 134217728 bytes, linear) May 27 03:51:14.189450 kernel: Inode-cache hash table entries: 8388608 (order: 14, 67108864 bytes, linear) May 27 03:51:14.189457 kernel: Fallback order for Node 0: 0 May 27 03:51:14.189464 kernel: Built 1 zonelists, mobility grouping on. Total pages: 67043584 May 27 03:51:14.189471 kernel: Policy zone: Normal May 27 03:51:14.189477 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 03:51:14.189484 kernel: software IO TLB: area num 128. May 27 03:51:14.189491 kernel: software IO TLB: mapped [mem 0x00000000fbc8f000-0x00000000ffc8f000] (64MB) May 27 03:51:14.189498 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=80, Nodes=1 May 27 03:51:14.189505 kernel: rcu: Preemptible hierarchical RCU implementation. 
May 27 03:51:14.189512 kernel: rcu: RCU event tracing is enabled. May 27 03:51:14.189519 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=80. May 27 03:51:14.189527 kernel: Trampoline variant of Tasks RCU enabled. May 27 03:51:14.189534 kernel: Tracing variant of Tasks RCU enabled. May 27 03:51:14.189541 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 27 03:51:14.189548 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=80 May 27 03:51:14.189555 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=80. May 27 03:51:14.189562 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=80. May 27 03:51:14.189569 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 27 03:51:14.189576 kernel: GICv3: GIC: Using split EOI/Deactivate mode May 27 03:51:14.189582 kernel: GICv3: 672 SPIs implemented May 27 03:51:14.189589 kernel: GICv3: 0 Extended SPIs implemented May 27 03:51:14.189596 kernel: Root IRQ handler: gic_handle_irq May 27 03:51:14.189602 kernel: GICv3: GICv3 features: 16 PPIs May 27 03:51:14.189610 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=1 May 27 03:51:14.189617 kernel: GICv3: CPU0: found redistributor 120000 region 0:0x00001001005c0000 May 27 03:51:14.189624 kernel: SRAT: PXM 0 -> ITS 0 -> Node 0 May 27 03:51:14.189631 kernel: SRAT: PXM 0 -> ITS 1 -> Node 0 May 27 03:51:14.189637 kernel: SRAT: PXM 0 -> ITS 2 -> Node 0 May 27 03:51:14.189644 kernel: SRAT: PXM 0 -> ITS 3 -> Node 0 May 27 03:51:14.189650 kernel: SRAT: PXM 0 -> ITS 4 -> Node 0 May 27 03:51:14.189657 kernel: SRAT: PXM 0 -> ITS 5 -> Node 0 May 27 03:51:14.189664 kernel: SRAT: PXM 0 -> ITS 6 -> Node 0 May 27 03:51:14.189670 kernel: SRAT: PXM 0 -> ITS 7 -> Node 0 May 27 03:51:14.189677 kernel: ITS [mem 0x100100040000-0x10010005ffff] May 27 03:51:14.189684 kernel: ITS@0x0000100100040000: allocated 8192 Devices @80000310000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189692 kernel: ITS@0x0000100100040000: allocated 32768 Interrupt Collections @80000320000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189699 kernel: ITS [mem 0x100100060000-0x10010007ffff] May 27 03:51:14.189706 kernel: ITS@0x0000100100060000: allocated 8192 Devices @80000340000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189713 kernel: ITS@0x0000100100060000: allocated 32768 Interrupt Collections @80000350000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189720 kernel: ITS [mem 0x100100080000-0x10010009ffff] May 27 03:51:14.189727 kernel: ITS@0x0000100100080000: allocated 8192 Devices @80000370000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189734 kernel: ITS@0x0000100100080000: allocated 32768 Interrupt Collections @80000380000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189741 kernel: ITS [mem 0x1001000a0000-0x1001000bffff] May 27 03:51:14.189747 kernel: ITS@0x00001001000a0000: allocated 8192 Devices @800003a0000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189754 kernel: ITS@0x00001001000a0000: allocated 32768 Interrupt Collections @800003b0000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189761 kernel: ITS [mem 0x1001000c0000-0x1001000dffff] May 27 03:51:14.189769 kernel: ITS@0x00001001000c0000: allocated 8192 Devices @800003d0000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189776 kernel: ITS@0x00001001000c0000: allocated 32768 Interrupt Collections @800003e0000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189783 kernel: ITS [mem 0x1001000e0000-0x1001000fffff] May 27 03:51:14.189790 
kernel: ITS@0x00001001000e0000: allocated 8192 Devices @80000800000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189797 kernel: ITS@0x00001001000e0000: allocated 32768 Interrupt Collections @80000810000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189803 kernel: ITS [mem 0x100100100000-0x10010011ffff] May 27 03:51:14.189810 kernel: ITS@0x0000100100100000: allocated 8192 Devices @80000830000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189817 kernel: ITS@0x0000100100100000: allocated 32768 Interrupt Collections @80000840000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189824 kernel: ITS [mem 0x100100120000-0x10010013ffff] May 27 03:51:14.189831 kernel: ITS@0x0000100100120000: allocated 8192 Devices @80000860000 (indirect, esz 8, psz 64K, shr 1) May 27 03:51:14.189838 kernel: ITS@0x0000100100120000: allocated 32768 Interrupt Collections @80000870000 (flat, esz 2, psz 64K, shr 1) May 27 03:51:14.189846 kernel: GICv3: using LPI property table @0x0000080000880000 May 27 03:51:14.189853 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000080000890000 May 27 03:51:14.189860 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 27 03:51:14.189866 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.189873 kernel: ACPI GTDT: found 1 memory-mapped timer block(s). May 27 03:51:14.189880 kernel: arch_timer: cp15 and mmio timer(s) running at 25.00MHz (phys/phys). May 27 03:51:14.189887 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns May 27 03:51:14.189894 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns May 27 03:51:14.189901 kernel: Console: colour dummy device 80x25 May 27 03:51:14.189908 kernel: printk: legacy console [tty0] enabled May 27 03:51:14.189915 kernel: ACPI: Core revision 20240827 May 27 03:51:14.189923 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) May 27 03:51:14.189930 kernel: pid_max: default: 81920 minimum: 640 May 27 03:51:14.189937 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 03:51:14.189944 kernel: landlock: Up and running. May 27 03:51:14.189951 kernel: SELinux: Initializing. May 27 03:51:14.189958 kernel: Mount-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 03:51:14.189966 kernel: Mountpoint-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 03:51:14.189973 kernel: rcu: Hierarchical SRCU implementation. May 27 03:51:14.189979 kernel: rcu: Max phase no-delay instances is 400. May 27 03:51:14.189988 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level May 27 03:51:14.189995 kernel: Remapping and enabling EFI services. May 27 03:51:14.190002 kernel: smp: Bringing up secondary CPUs ... 
May 27 03:51:14.190009 kernel: Detected PIPT I-cache on CPU1 May 27 03:51:14.190016 kernel: GICv3: CPU1: found redistributor 1a0000 region 0:0x00001001007c0000 May 27 03:51:14.190023 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000800008a0000 May 27 03:51:14.190029 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190036 kernel: CPU1: Booted secondary processor 0x00001a0000 [0x413fd0c1] May 27 03:51:14.190043 kernel: Detected PIPT I-cache on CPU2 May 27 03:51:14.190052 kernel: GICv3: CPU2: found redistributor 140000 region 0:0x0000100100640000 May 27 03:51:14.190059 kernel: GICv3: CPU2: using allocated LPI pending table @0x00000800008b0000 May 27 03:51:14.190066 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190072 kernel: CPU2: Booted secondary processor 0x0000140000 [0x413fd0c1] May 27 03:51:14.190079 kernel: Detected PIPT I-cache on CPU3 May 27 03:51:14.190086 kernel: GICv3: CPU3: found redistributor 1c0000 region 0:0x0000100100840000 May 27 03:51:14.190093 kernel: GICv3: CPU3: using allocated LPI pending table @0x00000800008c0000 May 27 03:51:14.190100 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190107 kernel: CPU3: Booted secondary processor 0x00001c0000 [0x413fd0c1] May 27 03:51:14.190114 kernel: Detected PIPT I-cache on CPU4 May 27 03:51:14.190122 kernel: GICv3: CPU4: found redistributor 100000 region 0:0x0000100100540000 May 27 03:51:14.190129 kernel: GICv3: CPU4: using allocated LPI pending table @0x00000800008d0000 May 27 03:51:14.190136 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190142 kernel: CPU4: Booted secondary processor 0x0000100000 [0x413fd0c1] May 27 03:51:14.190149 kernel: Detected PIPT I-cache on CPU5 May 27 03:51:14.190156 kernel: GICv3: CPU5: found redistributor 180000 region 0:0x0000100100740000 May 27 03:51:14.190163 kernel: GICv3: CPU5: using allocated LPI pending table @0x00000800008e0000 May 27 03:51:14.190170 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190177 kernel: CPU5: Booted secondary processor 0x0000180000 [0x413fd0c1] May 27 03:51:14.190185 kernel: Detected PIPT I-cache on CPU6 May 27 03:51:14.190192 kernel: GICv3: CPU6: found redistributor 160000 region 0:0x00001001006c0000 May 27 03:51:14.190198 kernel: GICv3: CPU6: using allocated LPI pending table @0x00000800008f0000 May 27 03:51:14.190208 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190215 kernel: CPU6: Booted secondary processor 0x0000160000 [0x413fd0c1] May 27 03:51:14.190222 kernel: Detected PIPT I-cache on CPU7 May 27 03:51:14.190229 kernel: GICv3: CPU7: found redistributor 1e0000 region 0:0x00001001008c0000 May 27 03:51:14.190236 kernel: GICv3: CPU7: using allocated LPI pending table @0x0000080000900000 May 27 03:51:14.190243 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190249 kernel: CPU7: Booted secondary processor 0x00001e0000 [0x413fd0c1] May 27 03:51:14.190258 kernel: Detected PIPT I-cache on CPU8 May 27 03:51:14.190265 kernel: GICv3: CPU8: found redistributor a0000 region 0:0x00001001003c0000 May 27 03:51:14.190272 kernel: GICv3: CPU8: using allocated LPI pending table @0x0000080000910000 May 27 03:51:14.190279 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190286 kernel: CPU8: Booted secondary processor 0x00000a0000 [0x413fd0c1] May 27 03:51:14.190292 
kernel: Detected PIPT I-cache on CPU9 May 27 03:51:14.190299 kernel: GICv3: CPU9: found redistributor 220000 region 0:0x00001001009c0000 May 27 03:51:14.190306 kernel: GICv3: CPU9: using allocated LPI pending table @0x0000080000920000 May 27 03:51:14.190313 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190321 kernel: CPU9: Booted secondary processor 0x0000220000 [0x413fd0c1] May 27 03:51:14.190328 kernel: Detected PIPT I-cache on CPU10 May 27 03:51:14.190335 kernel: GICv3: CPU10: found redistributor c0000 region 0:0x0000100100440000 May 27 03:51:14.190342 kernel: GICv3: CPU10: using allocated LPI pending table @0x0000080000930000 May 27 03:51:14.190348 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190355 kernel: CPU10: Booted secondary processor 0x00000c0000 [0x413fd0c1] May 27 03:51:14.190362 kernel: Detected PIPT I-cache on CPU11 May 27 03:51:14.190369 kernel: GICv3: CPU11: found redistributor 240000 region 0:0x0000100100a40000 May 27 03:51:14.190376 kernel: GICv3: CPU11: using allocated LPI pending table @0x0000080000940000 May 27 03:51:14.190383 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190391 kernel: CPU11: Booted secondary processor 0x0000240000 [0x413fd0c1] May 27 03:51:14.190398 kernel: Detected PIPT I-cache on CPU12 May 27 03:51:14.190404 kernel: GICv3: CPU12: found redistributor 80000 region 0:0x0000100100340000 May 27 03:51:14.190411 kernel: GICv3: CPU12: using allocated LPI pending table @0x0000080000950000 May 27 03:51:14.190418 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190425 kernel: CPU12: Booted secondary processor 0x0000080000 [0x413fd0c1] May 27 03:51:14.190432 kernel: Detected PIPT I-cache on CPU13 May 27 03:51:14.190439 kernel: GICv3: CPU13: found redistributor 200000 region 0:0x0000100100940000 May 27 03:51:14.190446 kernel: GICv3: CPU13: using allocated LPI pending table @0x0000080000960000 May 27 03:51:14.190454 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190461 kernel: CPU13: Booted secondary processor 0x0000200000 [0x413fd0c1] May 27 03:51:14.190468 kernel: Detected PIPT I-cache on CPU14 May 27 03:51:14.190476 kernel: GICv3: CPU14: found redistributor e0000 region 0:0x00001001004c0000 May 27 03:51:14.190483 kernel: GICv3: CPU14: using allocated LPI pending table @0x0000080000970000 May 27 03:51:14.190490 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190496 kernel: CPU14: Booted secondary processor 0x00000e0000 [0x413fd0c1] May 27 03:51:14.190503 kernel: Detected PIPT I-cache on CPU15 May 27 03:51:14.190510 kernel: GICv3: CPU15: found redistributor 260000 region 0:0x0000100100ac0000 May 27 03:51:14.190518 kernel: GICv3: CPU15: using allocated LPI pending table @0x0000080000980000 May 27 03:51:14.190525 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190532 kernel: CPU15: Booted secondary processor 0x0000260000 [0x413fd0c1] May 27 03:51:14.190539 kernel: Detected PIPT I-cache on CPU16 May 27 03:51:14.190546 kernel: GICv3: CPU16: found redistributor 20000 region 0:0x00001001001c0000 May 27 03:51:14.190553 kernel: GICv3: CPU16: using allocated LPI pending table @0x0000080000990000 May 27 03:51:14.190560 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190566 kernel: CPU16: Booted secondary processor 0x0000020000 [0x413fd0c1] May 27 03:51:14.190573 
kernel: Detected PIPT I-cache on CPU17 May 27 03:51:14.190580 kernel: GICv3: CPU17: found redistributor 40000 region 0:0x0000100100240000 May 27 03:51:14.190588 kernel: GICv3: CPU17: using allocated LPI pending table @0x00000800009a0000 May 27 03:51:14.190595 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190602 kernel: CPU17: Booted secondary processor 0x0000040000 [0x413fd0c1] May 27 03:51:14.190609 kernel: Detected PIPT I-cache on CPU18 May 27 03:51:14.190616 kernel: GICv3: CPU18: found redistributor 0 region 0:0x0000100100140000 May 27 03:51:14.190624 kernel: GICv3: CPU18: using allocated LPI pending table @0x00000800009b0000 May 27 03:51:14.190631 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190638 kernel: CPU18: Booted secondary processor 0x0000000000 [0x413fd0c1] May 27 03:51:14.190653 kernel: Detected PIPT I-cache on CPU19 May 27 03:51:14.190663 kernel: GICv3: CPU19: found redistributor 60000 region 0:0x00001001002c0000 May 27 03:51:14.190670 kernel: GICv3: CPU19: using allocated LPI pending table @0x00000800009c0000 May 27 03:51:14.190677 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190684 kernel: CPU19: Booted secondary processor 0x0000060000 [0x413fd0c1] May 27 03:51:14.190691 kernel: Detected PIPT I-cache on CPU20 May 27 03:51:14.190699 kernel: GICv3: CPU20: found redistributor 130000 region 0:0x0000100100600000 May 27 03:51:14.190706 kernel: GICv3: CPU20: using allocated LPI pending table @0x00000800009d0000 May 27 03:51:14.190713 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190720 kernel: CPU20: Booted secondary processor 0x0000130000 [0x413fd0c1] May 27 03:51:14.190729 kernel: Detected PIPT I-cache on CPU21 May 27 03:51:14.190736 kernel: GICv3: CPU21: found redistributor 1b0000 region 0:0x0000100100800000 May 27 03:51:14.190744 kernel: GICv3: CPU21: using allocated LPI pending table @0x00000800009e0000 May 27 03:51:14.190751 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190758 kernel: CPU21: Booted secondary processor 0x00001b0000 [0x413fd0c1] May 27 03:51:14.190765 kernel: Detected PIPT I-cache on CPU22 May 27 03:51:14.190772 kernel: GICv3: CPU22: found redistributor 150000 region 0:0x0000100100680000 May 27 03:51:14.190781 kernel: GICv3: CPU22: using allocated LPI pending table @0x00000800009f0000 May 27 03:51:14.190788 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190795 kernel: CPU22: Booted secondary processor 0x0000150000 [0x413fd0c1] May 27 03:51:14.190803 kernel: Detected PIPT I-cache on CPU23 May 27 03:51:14.190810 kernel: GICv3: CPU23: found redistributor 1d0000 region 0:0x0000100100880000 May 27 03:51:14.190817 kernel: GICv3: CPU23: using allocated LPI pending table @0x0000080000a00000 May 27 03:51:14.190824 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190831 kernel: CPU23: Booted secondary processor 0x00001d0000 [0x413fd0c1] May 27 03:51:14.190839 kernel: Detected PIPT I-cache on CPU24 May 27 03:51:14.190848 kernel: GICv3: CPU24: found redistributor 110000 region 0:0x0000100100580000 May 27 03:51:14.190855 kernel: GICv3: CPU24: using allocated LPI pending table @0x0000080000a10000 May 27 03:51:14.190862 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190870 kernel: CPU24: Booted secondary processor 0x0000110000 [0x413fd0c1] May 27 03:51:14.190877 
kernel: Detected PIPT I-cache on CPU25 May 27 03:51:14.190884 kernel: GICv3: CPU25: found redistributor 190000 region 0:0x0000100100780000 May 27 03:51:14.190891 kernel: GICv3: CPU25: using allocated LPI pending table @0x0000080000a20000 May 27 03:51:14.190899 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190906 kernel: CPU25: Booted secondary processor 0x0000190000 [0x413fd0c1] May 27 03:51:14.190914 kernel: Detected PIPT I-cache on CPU26 May 27 03:51:14.190921 kernel: GICv3: CPU26: found redistributor 170000 region 0:0x0000100100700000 May 27 03:51:14.190929 kernel: GICv3: CPU26: using allocated LPI pending table @0x0000080000a30000 May 27 03:51:14.190936 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190943 kernel: CPU26: Booted secondary processor 0x0000170000 [0x413fd0c1] May 27 03:51:14.190950 kernel: Detected PIPT I-cache on CPU27 May 27 03:51:14.190958 kernel: GICv3: CPU27: found redistributor 1f0000 region 0:0x0000100100900000 May 27 03:51:14.190966 kernel: GICv3: CPU27: using allocated LPI pending table @0x0000080000a40000 May 27 03:51:14.190973 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.190980 kernel: CPU27: Booted secondary processor 0x00001f0000 [0x413fd0c1] May 27 03:51:14.190989 kernel: Detected PIPT I-cache on CPU28 May 27 03:51:14.190996 kernel: GICv3: CPU28: found redistributor b0000 region 0:0x0000100100400000 May 27 03:51:14.191003 kernel: GICv3: CPU28: using allocated LPI pending table @0x0000080000a50000 May 27 03:51:14.191011 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191018 kernel: CPU28: Booted secondary processor 0x00000b0000 [0x413fd0c1] May 27 03:51:14.191025 kernel: Detected PIPT I-cache on CPU29 May 27 03:51:14.191032 kernel: GICv3: CPU29: found redistributor 230000 region 0:0x0000100100a00000 May 27 03:51:14.191039 kernel: GICv3: CPU29: using allocated LPI pending table @0x0000080000a60000 May 27 03:51:14.191047 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191055 kernel: CPU29: Booted secondary processor 0x0000230000 [0x413fd0c1] May 27 03:51:14.191062 kernel: Detected PIPT I-cache on CPU30 May 27 03:51:14.191070 kernel: GICv3: CPU30: found redistributor d0000 region 0:0x0000100100480000 May 27 03:51:14.191077 kernel: GICv3: CPU30: using allocated LPI pending table @0x0000080000a70000 May 27 03:51:14.191084 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191091 kernel: CPU30: Booted secondary processor 0x00000d0000 [0x413fd0c1] May 27 03:51:14.191098 kernel: Detected PIPT I-cache on CPU31 May 27 03:51:14.191106 kernel: GICv3: CPU31: found redistributor 250000 region 0:0x0000100100a80000 May 27 03:51:14.191113 kernel: GICv3: CPU31: using allocated LPI pending table @0x0000080000a80000 May 27 03:51:14.191121 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191128 kernel: CPU31: Booted secondary processor 0x0000250000 [0x413fd0c1] May 27 03:51:14.191136 kernel: Detected PIPT I-cache on CPU32 May 27 03:51:14.191143 kernel: GICv3: CPU32: found redistributor 90000 region 0:0x0000100100380000 May 27 03:51:14.191150 kernel: GICv3: CPU32: using allocated LPI pending table @0x0000080000a90000 May 27 03:51:14.191157 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191165 kernel: CPU32: Booted secondary processor 0x0000090000 [0x413fd0c1] May 27 
03:51:14.191172 kernel: Detected PIPT I-cache on CPU33 May 27 03:51:14.191179 kernel: GICv3: CPU33: found redistributor 210000 region 0:0x0000100100980000 May 27 03:51:14.191187 kernel: GICv3: CPU33: using allocated LPI pending table @0x0000080000aa0000 May 27 03:51:14.191195 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191202 kernel: CPU33: Booted secondary processor 0x0000210000 [0x413fd0c1] May 27 03:51:14.191212 kernel: Detected PIPT I-cache on CPU34 May 27 03:51:14.191219 kernel: GICv3: CPU34: found redistributor f0000 region 0:0x0000100100500000 May 27 03:51:14.191226 kernel: GICv3: CPU34: using allocated LPI pending table @0x0000080000ab0000 May 27 03:51:14.191233 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191240 kernel: CPU34: Booted secondary processor 0x00000f0000 [0x413fd0c1] May 27 03:51:14.191247 kernel: Detected PIPT I-cache on CPU35 May 27 03:51:14.191255 kernel: GICv3: CPU35: found redistributor 270000 region 0:0x0000100100b00000 May 27 03:51:14.191264 kernel: GICv3: CPU35: using allocated LPI pending table @0x0000080000ac0000 May 27 03:51:14.191272 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191279 kernel: CPU35: Booted secondary processor 0x0000270000 [0x413fd0c1] May 27 03:51:14.191288 kernel: Detected PIPT I-cache on CPU36 May 27 03:51:14.191295 kernel: GICv3: CPU36: found redistributor 30000 region 0:0x0000100100200000 May 27 03:51:14.191302 kernel: GICv3: CPU36: using allocated LPI pending table @0x0000080000ad0000 May 27 03:51:14.191310 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191317 kernel: CPU36: Booted secondary processor 0x0000030000 [0x413fd0c1] May 27 03:51:14.191324 kernel: Detected PIPT I-cache on CPU37 May 27 03:51:14.191332 kernel: GICv3: CPU37: found redistributor 50000 region 0:0x0000100100280000 May 27 03:51:14.191340 kernel: GICv3: CPU37: using allocated LPI pending table @0x0000080000ae0000 May 27 03:51:14.191348 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191355 kernel: CPU37: Booted secondary processor 0x0000050000 [0x413fd0c1] May 27 03:51:14.191362 kernel: Detected PIPT I-cache on CPU38 May 27 03:51:14.191369 kernel: GICv3: CPU38: found redistributor 10000 region 0:0x0000100100180000 May 27 03:51:14.191376 kernel: GICv3: CPU38: using allocated LPI pending table @0x0000080000af0000 May 27 03:51:14.191384 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191391 kernel: CPU38: Booted secondary processor 0x0000010000 [0x413fd0c1] May 27 03:51:14.191398 kernel: Detected PIPT I-cache on CPU39 May 27 03:51:14.191406 kernel: GICv3: CPU39: found redistributor 70000 region 0:0x0000100100300000 May 27 03:51:14.191414 kernel: GICv3: CPU39: using allocated LPI pending table @0x0000080000b00000 May 27 03:51:14.191421 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191428 kernel: CPU39: Booted secondary processor 0x0000070000 [0x413fd0c1] May 27 03:51:14.191435 kernel: Detected PIPT I-cache on CPU40 May 27 03:51:14.191443 kernel: GICv3: CPU40: found redistributor 120100 region 0:0x00001001005e0000 May 27 03:51:14.191450 kernel: GICv3: CPU40: using allocated LPI pending table @0x0000080000b10000 May 27 03:51:14.191457 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191466 kernel: CPU40: Booted secondary processor 0x0000120100 [0x413fd0c1] May 27 
03:51:14.191473 kernel: Detected PIPT I-cache on CPU41 May 27 03:51:14.191480 kernel: GICv3: CPU41: found redistributor 1a0100 region 0:0x00001001007e0000 May 27 03:51:14.191488 kernel: GICv3: CPU41: using allocated LPI pending table @0x0000080000b20000 May 27 03:51:14.191495 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191502 kernel: CPU41: Booted secondary processor 0x00001a0100 [0x413fd0c1] May 27 03:51:14.191510 kernel: Detected PIPT I-cache on CPU42 May 27 03:51:14.191517 kernel: GICv3: CPU42: found redistributor 140100 region 0:0x0000100100660000 May 27 03:51:14.191524 kernel: GICv3: CPU42: using allocated LPI pending table @0x0000080000b30000 May 27 03:51:14.191533 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191540 kernel: CPU42: Booted secondary processor 0x0000140100 [0x413fd0c1] May 27 03:51:14.191547 kernel: Detected PIPT I-cache on CPU43 May 27 03:51:14.191554 kernel: GICv3: CPU43: found redistributor 1c0100 region 0:0x0000100100860000 May 27 03:51:14.191562 kernel: GICv3: CPU43: using allocated LPI pending table @0x0000080000b40000 May 27 03:51:14.191569 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191576 kernel: CPU43: Booted secondary processor 0x00001c0100 [0x413fd0c1] May 27 03:51:14.191583 kernel: Detected PIPT I-cache on CPU44 May 27 03:51:14.191590 kernel: GICv3: CPU44: found redistributor 100100 region 0:0x0000100100560000 May 27 03:51:14.191598 kernel: GICv3: CPU44: using allocated LPI pending table @0x0000080000b50000 May 27 03:51:14.191606 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191614 kernel: CPU44: Booted secondary processor 0x0000100100 [0x413fd0c1] May 27 03:51:14.191621 kernel: Detected PIPT I-cache on CPU45 May 27 03:51:14.191628 kernel: GICv3: CPU45: found redistributor 180100 region 0:0x0000100100760000 May 27 03:51:14.191635 kernel: GICv3: CPU45: using allocated LPI pending table @0x0000080000b60000 May 27 03:51:14.191642 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191650 kernel: CPU45: Booted secondary processor 0x0000180100 [0x413fd0c1] May 27 03:51:14.191657 kernel: Detected PIPT I-cache on CPU46 May 27 03:51:14.191664 kernel: GICv3: CPU46: found redistributor 160100 region 0:0x00001001006e0000 May 27 03:51:14.191672 kernel: GICv3: CPU46: using allocated LPI pending table @0x0000080000b70000 May 27 03:51:14.191680 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191687 kernel: CPU46: Booted secondary processor 0x0000160100 [0x413fd0c1] May 27 03:51:14.191694 kernel: Detected PIPT I-cache on CPU47 May 27 03:51:14.191702 kernel: GICv3: CPU47: found redistributor 1e0100 region 0:0x00001001008e0000 May 27 03:51:14.191709 kernel: GICv3: CPU47: using allocated LPI pending table @0x0000080000b80000 May 27 03:51:14.191716 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191724 kernel: CPU47: Booted secondary processor 0x00001e0100 [0x413fd0c1] May 27 03:51:14.191731 kernel: Detected PIPT I-cache on CPU48 May 27 03:51:14.191738 kernel: GICv3: CPU48: found redistributor a0100 region 0:0x00001001003e0000 May 27 03:51:14.191747 kernel: GICv3: CPU48: using allocated LPI pending table @0x0000080000b90000 May 27 03:51:14.191754 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191761 kernel: CPU48: Booted secondary processor 0x00000a0100 [0x413fd0c1] 
May 27 03:51:14.191768 kernel: Detected PIPT I-cache on CPU49 May 27 03:51:14.191776 kernel: GICv3: CPU49: found redistributor 220100 region 0:0x00001001009e0000 May 27 03:51:14.191784 kernel: GICv3: CPU49: using allocated LPI pending table @0x0000080000ba0000 May 27 03:51:14.191792 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191799 kernel: CPU49: Booted secondary processor 0x0000220100 [0x413fd0c1] May 27 03:51:14.191806 kernel: Detected PIPT I-cache on CPU50 May 27 03:51:14.191814 kernel: GICv3: CPU50: found redistributor c0100 region 0:0x0000100100460000 May 27 03:51:14.191822 kernel: GICv3: CPU50: using allocated LPI pending table @0x0000080000bb0000 May 27 03:51:14.191829 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191836 kernel: CPU50: Booted secondary processor 0x00000c0100 [0x413fd0c1] May 27 03:51:14.191843 kernel: Detected PIPT I-cache on CPU51 May 27 03:51:14.191850 kernel: GICv3: CPU51: found redistributor 240100 region 0:0x0000100100a60000 May 27 03:51:14.191858 kernel: GICv3: CPU51: using allocated LPI pending table @0x0000080000bc0000 May 27 03:51:14.191865 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191872 kernel: CPU51: Booted secondary processor 0x0000240100 [0x413fd0c1] May 27 03:51:14.191879 kernel: Detected PIPT I-cache on CPU52 May 27 03:51:14.191888 kernel: GICv3: CPU52: found redistributor 80100 region 0:0x0000100100360000 May 27 03:51:14.191895 kernel: GICv3: CPU52: using allocated LPI pending table @0x0000080000bd0000 May 27 03:51:14.191902 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191910 kernel: CPU52: Booted secondary processor 0x0000080100 [0x413fd0c1] May 27 03:51:14.191917 kernel: Detected PIPT I-cache on CPU53 May 27 03:51:14.191924 kernel: GICv3: CPU53: found redistributor 200100 region 0:0x0000100100960000 May 27 03:51:14.191931 kernel: GICv3: CPU53: using allocated LPI pending table @0x0000080000be0000 May 27 03:51:14.191939 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191946 kernel: CPU53: Booted secondary processor 0x0000200100 [0x413fd0c1] May 27 03:51:14.191954 kernel: Detected PIPT I-cache on CPU54 May 27 03:51:14.191962 kernel: GICv3: CPU54: found redistributor e0100 region 0:0x00001001004e0000 May 27 03:51:14.191969 kernel: GICv3: CPU54: using allocated LPI pending table @0x0000080000bf0000 May 27 03:51:14.191976 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.191984 kernel: CPU54: Booted secondary processor 0x00000e0100 [0x413fd0c1] May 27 03:51:14.191991 kernel: Detected PIPT I-cache on CPU55 May 27 03:51:14.191998 kernel: GICv3: CPU55: found redistributor 260100 region 0:0x0000100100ae0000 May 27 03:51:14.192005 kernel: GICv3: CPU55: using allocated LPI pending table @0x0000080000c00000 May 27 03:51:14.192013 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192020 kernel: CPU55: Booted secondary processor 0x0000260100 [0x413fd0c1] May 27 03:51:14.192028 kernel: Detected PIPT I-cache on CPU56 May 27 03:51:14.192036 kernel: GICv3: CPU56: found redistributor 20100 region 0:0x00001001001e0000 May 27 03:51:14.192043 kernel: GICv3: CPU56: using allocated LPI pending table @0x0000080000c10000 May 27 03:51:14.192051 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192058 kernel: CPU56: Booted secondary processor 0x0000020100 
[0x413fd0c1] May 27 03:51:14.192065 kernel: Detected PIPT I-cache on CPU57 May 27 03:51:14.192074 kernel: GICv3: CPU57: found redistributor 40100 region 0:0x0000100100260000 May 27 03:51:14.192081 kernel: GICv3: CPU57: using allocated LPI pending table @0x0000080000c20000 May 27 03:51:14.192088 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192097 kernel: CPU57: Booted secondary processor 0x0000040100 [0x413fd0c1] May 27 03:51:14.192104 kernel: Detected PIPT I-cache on CPU58 May 27 03:51:14.192111 kernel: GICv3: CPU58: found redistributor 100 region 0:0x0000100100160000 May 27 03:51:14.192119 kernel: GICv3: CPU58: using allocated LPI pending table @0x0000080000c30000 May 27 03:51:14.192126 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192133 kernel: CPU58: Booted secondary processor 0x0000000100 [0x413fd0c1] May 27 03:51:14.192141 kernel: Detected PIPT I-cache on CPU59 May 27 03:51:14.192148 kernel: GICv3: CPU59: found redistributor 60100 region 0:0x00001001002e0000 May 27 03:51:14.192155 kernel: GICv3: CPU59: using allocated LPI pending table @0x0000080000c40000 May 27 03:51:14.192164 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192171 kernel: CPU59: Booted secondary processor 0x0000060100 [0x413fd0c1] May 27 03:51:14.192178 kernel: Detected PIPT I-cache on CPU60 May 27 03:51:14.192185 kernel: GICv3: CPU60: found redistributor 130100 region 0:0x0000100100620000 May 27 03:51:14.192193 kernel: GICv3: CPU60: using allocated LPI pending table @0x0000080000c50000 May 27 03:51:14.192200 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192209 kernel: CPU60: Booted secondary processor 0x0000130100 [0x413fd0c1] May 27 03:51:14.192217 kernel: Detected PIPT I-cache on CPU61 May 27 03:51:14.192224 kernel: GICv3: CPU61: found redistributor 1b0100 region 0:0x0000100100820000 May 27 03:51:14.192231 kernel: GICv3: CPU61: using allocated LPI pending table @0x0000080000c60000 May 27 03:51:14.192240 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192247 kernel: CPU61: Booted secondary processor 0x00001b0100 [0x413fd0c1] May 27 03:51:14.192255 kernel: Detected PIPT I-cache on CPU62 May 27 03:51:14.192262 kernel: GICv3: CPU62: found redistributor 150100 region 0:0x00001001006a0000 May 27 03:51:14.192269 kernel: GICv3: CPU62: using allocated LPI pending table @0x0000080000c70000 May 27 03:51:14.192276 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192284 kernel: CPU62: Booted secondary processor 0x0000150100 [0x413fd0c1] May 27 03:51:14.192291 kernel: Detected PIPT I-cache on CPU63 May 27 03:51:14.192298 kernel: GICv3: CPU63: found redistributor 1d0100 region 0:0x00001001008a0000 May 27 03:51:14.192306 kernel: GICv3: CPU63: using allocated LPI pending table @0x0000080000c80000 May 27 03:51:14.192314 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192321 kernel: CPU63: Booted secondary processor 0x00001d0100 [0x413fd0c1] May 27 03:51:14.192328 kernel: Detected PIPT I-cache on CPU64 May 27 03:51:14.192336 kernel: GICv3: CPU64: found redistributor 110100 region 0:0x00001001005a0000 May 27 03:51:14.192343 kernel: GICv3: CPU64: using allocated LPI pending table @0x0000080000c90000 May 27 03:51:14.192350 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192357 kernel: CPU64: Booted secondary processor 0x0000110100 
[0x413fd0c1] May 27 03:51:14.192364 kernel: Detected PIPT I-cache on CPU65 May 27 03:51:14.192372 kernel: GICv3: CPU65: found redistributor 190100 region 0:0x00001001007a0000 May 27 03:51:14.192380 kernel: GICv3: CPU65: using allocated LPI pending table @0x0000080000ca0000 May 27 03:51:14.192388 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192395 kernel: CPU65: Booted secondary processor 0x0000190100 [0x413fd0c1] May 27 03:51:14.192402 kernel: Detected PIPT I-cache on CPU66 May 27 03:51:14.192409 kernel: GICv3: CPU66: found redistributor 170100 region 0:0x0000100100720000 May 27 03:51:14.192416 kernel: GICv3: CPU66: using allocated LPI pending table @0x0000080000cb0000 May 27 03:51:14.192424 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192431 kernel: CPU66: Booted secondary processor 0x0000170100 [0x413fd0c1] May 27 03:51:14.192438 kernel: Detected PIPT I-cache on CPU67 May 27 03:51:14.192447 kernel: GICv3: CPU67: found redistributor 1f0100 region 0:0x0000100100920000 May 27 03:51:14.192454 kernel: GICv3: CPU67: using allocated LPI pending table @0x0000080000cc0000 May 27 03:51:14.192461 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192469 kernel: CPU67: Booted secondary processor 0x00001f0100 [0x413fd0c1] May 27 03:51:14.192476 kernel: Detected PIPT I-cache on CPU68 May 27 03:51:14.192483 kernel: GICv3: CPU68: found redistributor b0100 region 0:0x0000100100420000 May 27 03:51:14.192490 kernel: GICv3: CPU68: using allocated LPI pending table @0x0000080000cd0000 May 27 03:51:14.192498 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192505 kernel: CPU68: Booted secondary processor 0x00000b0100 [0x413fd0c1] May 27 03:51:14.192512 kernel: Detected PIPT I-cache on CPU69 May 27 03:51:14.192521 kernel: GICv3: CPU69: found redistributor 230100 region 0:0x0000100100a20000 May 27 03:51:14.192529 kernel: GICv3: CPU69: using allocated LPI pending table @0x0000080000ce0000 May 27 03:51:14.192536 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192543 kernel: CPU69: Booted secondary processor 0x0000230100 [0x413fd0c1] May 27 03:51:14.192551 kernel: Detected PIPT I-cache on CPU70 May 27 03:51:14.192558 kernel: GICv3: CPU70: found redistributor d0100 region 0:0x00001001004a0000 May 27 03:51:14.192566 kernel: GICv3: CPU70: using allocated LPI pending table @0x0000080000cf0000 May 27 03:51:14.192573 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192580 kernel: CPU70: Booted secondary processor 0x00000d0100 [0x413fd0c1] May 27 03:51:14.192589 kernel: Detected PIPT I-cache on CPU71 May 27 03:51:14.192596 kernel: GICv3: CPU71: found redistributor 250100 region 0:0x0000100100aa0000 May 27 03:51:14.192603 kernel: GICv3: CPU71: using allocated LPI pending table @0x0000080000d00000 May 27 03:51:14.192610 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192618 kernel: CPU71: Booted secondary processor 0x0000250100 [0x413fd0c1] May 27 03:51:14.192625 kernel: Detected PIPT I-cache on CPU72 May 27 03:51:14.192632 kernel: GICv3: CPU72: found redistributor 90100 region 0:0x00001001003a0000 May 27 03:51:14.192640 kernel: GICv3: CPU72: using allocated LPI pending table @0x0000080000d10000 May 27 03:51:14.192647 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192654 kernel: CPU72: Booted secondary processor 
0x0000090100 [0x413fd0c1] May 27 03:51:14.192662 kernel: Detected PIPT I-cache on CPU73 May 27 03:51:14.192670 kernel: GICv3: CPU73: found redistributor 210100 region 0:0x00001001009a0000 May 27 03:51:14.192677 kernel: GICv3: CPU73: using allocated LPI pending table @0x0000080000d20000 May 27 03:51:14.192684 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192692 kernel: CPU73: Booted secondary processor 0x0000210100 [0x413fd0c1] May 27 03:51:14.192699 kernel: Detected PIPT I-cache on CPU74 May 27 03:51:14.192706 kernel: GICv3: CPU74: found redistributor f0100 region 0:0x0000100100520000 May 27 03:51:14.192714 kernel: GICv3: CPU74: using allocated LPI pending table @0x0000080000d30000 May 27 03:51:14.192721 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192730 kernel: CPU74: Booted secondary processor 0x00000f0100 [0x413fd0c1] May 27 03:51:14.192737 kernel: Detected PIPT I-cache on CPU75 May 27 03:51:14.192745 kernel: GICv3: CPU75: found redistributor 270100 region 0:0x0000100100b20000 May 27 03:51:14.192752 kernel: GICv3: CPU75: using allocated LPI pending table @0x0000080000d40000 May 27 03:51:14.192759 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192766 kernel: CPU75: Booted secondary processor 0x0000270100 [0x413fd0c1] May 27 03:51:14.192774 kernel: Detected PIPT I-cache on CPU76 May 27 03:51:14.192781 kernel: GICv3: CPU76: found redistributor 30100 region 0:0x0000100100220000 May 27 03:51:14.192788 kernel: GICv3: CPU76: using allocated LPI pending table @0x0000080000d50000 May 27 03:51:14.192796 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192804 kernel: CPU76: Booted secondary processor 0x0000030100 [0x413fd0c1] May 27 03:51:14.192812 kernel: Detected PIPT I-cache on CPU77 May 27 03:51:14.192819 kernel: GICv3: CPU77: found redistributor 50100 region 0:0x00001001002a0000 May 27 03:51:14.192826 kernel: GICv3: CPU77: using allocated LPI pending table @0x0000080000d60000 May 27 03:51:14.192834 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192841 kernel: CPU77: Booted secondary processor 0x0000050100 [0x413fd0c1] May 27 03:51:14.192848 kernel: Detected PIPT I-cache on CPU78 May 27 03:51:14.192856 kernel: GICv3: CPU78: found redistributor 10100 region 0:0x00001001001a0000 May 27 03:51:14.192863 kernel: GICv3: CPU78: using allocated LPI pending table @0x0000080000d70000 May 27 03:51:14.192872 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192879 kernel: CPU78: Booted secondary processor 0x0000010100 [0x413fd0c1] May 27 03:51:14.192886 kernel: Detected PIPT I-cache on CPU79 May 27 03:51:14.192894 kernel: GICv3: CPU79: found redistributor 70100 region 0:0x0000100100320000 May 27 03:51:14.192901 kernel: GICv3: CPU79: using allocated LPI pending table @0x0000080000d80000 May 27 03:51:14.192908 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 03:51:14.192915 kernel: CPU79: Booted secondary processor 0x0000070100 [0x413fd0c1] May 27 03:51:14.192922 kernel: smp: Brought up 1 node, 80 CPUs May 27 03:51:14.192930 kernel: SMP: Total of 80 processors activated. 
May 27 03:51:14.192937 kernel: CPU: All CPU(s) started at EL2 May 27 03:51:14.192946 kernel: CPU features: detected: 32-bit EL0 Support May 27 03:51:14.192953 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence May 27 03:51:14.192960 kernel: CPU features: detected: Common not Private translations May 27 03:51:14.192967 kernel: CPU features: detected: CRC32 instructions May 27 03:51:14.192975 kernel: CPU features: detected: Enhanced Virtualization Traps May 27 03:51:14.192982 kernel: CPU features: detected: RCpc load-acquire (LDAPR) May 27 03:51:14.192989 kernel: CPU features: detected: LSE atomic instructions May 27 03:51:14.192997 kernel: CPU features: detected: Privileged Access Never May 27 03:51:14.193004 kernel: CPU features: detected: RAS Extension Support May 27 03:51:14.193013 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) May 27 03:51:14.193020 kernel: alternatives: applying system-wide alternatives May 27 03:51:14.193027 kernel: CPU features: detected: Hardware dirty bit management on CPU0-79 May 27 03:51:14.193035 kernel: Memory: 262860252K/268174336K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 5254404K reserved, 0K cma-reserved) May 27 03:51:14.193042 kernel: devtmpfs: initialized May 27 03:51:14.193050 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 03:51:14.193057 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) May 27 03:51:14.193065 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL May 27 03:51:14.193073 kernel: 0 pages in range for non-PLT usage May 27 03:51:14.193080 kernel: 508544 pages in range for PLT usage May 27 03:51:14.193087 kernel: pinctrl core: initialized pinctrl subsystem May 27 03:51:14.193095 kernel: SMBIOS 3.4.0 present. May 27 03:51:14.193102 kernel: DMI: GIGABYTE R272-P30-JG/MP32-AR0-JG, BIOS F17a (SCP: 1.07.20210713) 07/22/2021 May 27 03:51:14.193109 kernel: DMI: Memory slots populated: 8/16 May 27 03:51:14.193116 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 03:51:14.193124 kernel: DMA: preallocated 4096 KiB GFP_KERNEL pool for atomic allocations May 27 03:51:14.193131 kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 27 03:51:14.193140 kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 27 03:51:14.193147 kernel: audit: initializing netlink subsys (disabled) May 27 03:51:14.193154 kernel: audit: type=2000 audit(0.157:1): state=initialized audit_enabled=0 res=1 May 27 03:51:14.193162 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 03:51:14.193169 kernel: cpuidle: using governor menu May 27 03:51:14.193176 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
May 27 03:51:14.193183 kernel: ASID allocator initialised with 32768 entries May 27 03:51:14.193191 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 03:51:14.193198 kernel: Serial: AMBA PL011 UART driver May 27 03:51:14.193208 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 03:51:14.193216 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 27 03:51:14.193224 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 27 03:51:14.193231 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 27 03:51:14.193238 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 03:51:14.193246 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 27 03:51:14.193253 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 27 03:51:14.193260 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 27 03:51:14.193267 kernel: ACPI: Added _OSI(Module Device) May 27 03:51:14.193276 kernel: ACPI: Added _OSI(Processor Device) May 27 03:51:14.193283 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 03:51:14.193291 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 03:51:14.193298 kernel: ACPI: 2 ACPI AML tables successfully acquired and loaded May 27 03:51:14.193305 kernel: ACPI: Interpreter enabled May 27 03:51:14.193312 kernel: ACPI: Using GIC for interrupt routing May 27 03:51:14.193319 kernel: ACPI: MCFG table detected, 8 entries May 27 03:51:14.193327 kernel: ACPI: IORT: SMMU-v3[33ffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193334 kernel: ACPI: IORT: SMMU-v3[37ffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193342 kernel: ACPI: IORT: SMMU-v3[3bffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193349 kernel: ACPI: IORT: SMMU-v3[3fffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193357 kernel: ACPI: IORT: SMMU-v3[23ffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193364 kernel: ACPI: IORT: SMMU-v3[27ffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193371 kernel: ACPI: IORT: SMMU-v3[2bffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193378 kernel: ACPI: IORT: SMMU-v3[2fffe0000000] Mapped to Proximity domain 0 May 27 03:51:14.193386 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x100002600000 (irq = 19, base_baud = 0) is a SBSA May 27 03:51:14.193393 kernel: printk: legacy console [ttyAMA0] enabled May 27 03:51:14.193401 kernel: ARMH0011:01: ttyAMA1 at MMIO 0x100002620000 (irq = 20, base_baud = 0) is a SBSA May 27 03:51:14.193409 kernel: ACPI: PCI Root Bridge [PCI1] (domain 000d [bus 00-ff]) May 27 03:51:14.193541 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.193607 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.193666 kernel: acpi PNP0A08:00: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.193725 kernel: acpi PNP0A08:00: MCFG quirk: ECAM at [mem 0x37fff0000000-0x37ffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.193782 kernel: acpi PNP0A08:00: ECAM area [mem 0x37fff0000000-0x37ffffffffff] reserved by PNP0C02:00 May 27 03:51:14.193839 kernel: acpi PNP0A08:00: ECAM at [mem 0x37fff0000000-0x37ffffffffff] for [bus 00-ff] May 27 03:51:14.193850 kernel: PCI host bridge to bus 000d:00 May 27 03:51:14.193917 kernel: pci_bus 000d:00: root bus resource [mem 0x50000000-0x5fffffff window] May 27 03:51:14.193971 kernel: pci_bus 000d:00: root bus resource [mem 
0x340000000000-0x37ffdfffffff window] May 27 03:51:14.194024 kernel: pci_bus 000d:00: root bus resource [bus 00-ff] May 27 03:51:14.194102 kernel: pci 000d:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.194173 kernel: pci 000d:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.194243 kernel: pci 000d:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.194305 kernel: pci 000d:00:01.0: enabling Extended Tags May 27 03:51:14.194365 kernel: pci 000d:00:01.0: supports D1 D2 May 27 03:51:14.194424 kernel: pci 000d:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.194493 kernel: pci 000d:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.194553 kernel: pci 000d:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.194613 kernel: pci 000d:00:02.0: supports D1 D2 May 27 03:51:14.194675 kernel: pci 000d:00:02.0: PME# supported from D0 D1 D3hot May 27 03:51:14.194743 kernel: pci 000d:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.194803 kernel: pci 000d:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.194863 kernel: pci 000d:00:03.0: supports D1 D2 May 27 03:51:14.194922 kernel: pci 000d:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.194990 kernel: pci 000d:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.195050 kernel: pci 000d:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.195112 kernel: pci 000d:00:04.0: supports D1 D2 May 27 03:51:14.195172 kernel: pci 000d:00:04.0: PME# supported from D0 D1 D3hot May 27 03:51:14.195181 kernel: acpiphp: Slot [1] registered May 27 03:51:14.195188 kernel: acpiphp: Slot [2] registered May 27 03:51:14.195196 kernel: acpiphp: Slot [3] registered May 27 03:51:14.195203 kernel: acpiphp: Slot [4] registered May 27 03:51:14.195310 kernel: pci_bus 000d:00: on NUMA node 0 May 27 03:51:14.195373 kernel: pci 000d:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.195436 kernel: pci 000d:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.195495 kernel: pci 000d:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.195555 kernel: pci 000d:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.195615 kernel: pci 000d:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.195674 kernel: pci 000d:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.195735 kernel: pci 000d:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.195796 kernel: pci 000d:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.195857 kernel: pci 000d:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.195917 kernel: pci 000d:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.195976 kernel: pci 000d:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.196036 kernel: pci 000d:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.196096 kernel: pci 000d:00:01.0: bridge window [mem 0x50000000-0x501fffff]: assigned May 27 03:51:14.196155 
kernel: pci 000d:00:01.0: bridge window [mem 0x340000000000-0x3400001fffff 64bit pref]: assigned May 27 03:51:14.196219 kernel: pci 000d:00:02.0: bridge window [mem 0x50200000-0x503fffff]: assigned May 27 03:51:14.196281 kernel: pci 000d:00:02.0: bridge window [mem 0x340000200000-0x3400003fffff 64bit pref]: assigned May 27 03:51:14.196340 kernel: pci 000d:00:03.0: bridge window [mem 0x50400000-0x505fffff]: assigned May 27 03:51:14.196400 kernel: pci 000d:00:03.0: bridge window [mem 0x340000400000-0x3400005fffff 64bit pref]: assigned May 27 03:51:14.196460 kernel: pci 000d:00:04.0: bridge window [mem 0x50600000-0x507fffff]: assigned May 27 03:51:14.196519 kernel: pci 000d:00:04.0: bridge window [mem 0x340000600000-0x3400007fffff 64bit pref]: assigned May 27 03:51:14.196579 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.196637 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.196698 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.196758 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.196818 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.196877 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.196937 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.196996 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.197056 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.197117 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.197176 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.197238 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.197299 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.197359 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.197419 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.197477 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.197537 kernel: pci 000d:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.197596 kernel: pci 000d:00:01.0: bridge window [mem 0x50000000-0x501fffff] May 27 03:51:14.197658 kernel: pci 000d:00:01.0: bridge window [mem 0x340000000000-0x3400001fffff 64bit pref] May 27 03:51:14.197718 kernel: pci 000d:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.197778 kernel: pci 000d:00:02.0: bridge window [mem 0x50200000-0x503fffff] May 27 03:51:14.197838 kernel: pci 000d:00:02.0: bridge window [mem 0x340000200000-0x3400003fffff 64bit pref] May 27 03:51:14.197897 kernel: pci 000d:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.197957 kernel: pci 000d:00:03.0: bridge window [mem 0x50400000-0x505fffff] May 27 03:51:14.198018 kernel: pci 000d:00:03.0: bridge window [mem 0x340000400000-0x3400005fffff 64bit pref] May 27 03:51:14.198078 kernel: pci 000d:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.198137 kernel: pci 000d:00:04.0: bridge window [mem 0x50600000-0x507fffff] May 27 03:51:14.198197 kernel: pci 000d:00:04.0: bridge window [mem 0x340000600000-0x3400007fffff 64bit pref] May 27 03:51:14.198255 kernel: pci_bus 000d:00: resource 4 [mem 0x50000000-0x5fffffff 
window] May 27 03:51:14.198308 kernel: pci_bus 000d:00: resource 5 [mem 0x340000000000-0x37ffdfffffff window] May 27 03:51:14.198372 kernel: pci_bus 000d:01: resource 1 [mem 0x50000000-0x501fffff] May 27 03:51:14.198430 kernel: pci_bus 000d:01: resource 2 [mem 0x340000000000-0x3400001fffff 64bit pref] May 27 03:51:14.198495 kernel: pci_bus 000d:02: resource 1 [mem 0x50200000-0x503fffff] May 27 03:51:14.198550 kernel: pci_bus 000d:02: resource 2 [mem 0x340000200000-0x3400003fffff 64bit pref] May 27 03:51:14.198623 kernel: pci_bus 000d:03: resource 1 [mem 0x50400000-0x505fffff] May 27 03:51:14.198680 kernel: pci_bus 000d:03: resource 2 [mem 0x340000400000-0x3400005fffff 64bit pref] May 27 03:51:14.198744 kernel: pci_bus 000d:04: resource 1 [mem 0x50600000-0x507fffff] May 27 03:51:14.198803 kernel: pci_bus 000d:04: resource 2 [mem 0x340000600000-0x3400007fffff 64bit pref] May 27 03:51:14.198813 kernel: ACPI: PCI Root Bridge [PCI3] (domain 0000 [bus 00-ff]) May 27 03:51:14.198877 kernel: acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.198936 kernel: acpi PNP0A08:01: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.198993 kernel: acpi PNP0A08:01: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.199050 kernel: acpi PNP0A08:01: MCFG quirk: ECAM at [mem 0x3ffff0000000-0x3fffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.199109 kernel: acpi PNP0A08:01: ECAM area [mem 0x3ffff0000000-0x3fffffffffff] reserved by PNP0C02:00 May 27 03:51:14.199165 kernel: acpi PNP0A08:01: ECAM at [mem 0x3ffff0000000-0x3fffffffffff] for [bus 00-ff] May 27 03:51:14.199175 kernel: PCI host bridge to bus 0000:00 May 27 03:51:14.199241 kernel: pci_bus 0000:00: root bus resource [mem 0x70000000-0x7fffffff window] May 27 03:51:14.199297 kernel: pci_bus 0000:00: root bus resource [mem 0x3c0000000000-0x3fffdfffffff window] May 27 03:51:14.199351 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 27 03:51:14.199421 kernel: pci 0000:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.199491 kernel: pci 0000:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.199551 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.199611 kernel: pci 0000:00:01.0: enabling Extended Tags May 27 03:51:14.199670 kernel: pci 0000:00:01.0: supports D1 D2 May 27 03:51:14.199729 kernel: pci 0000:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.199796 kernel: pci 0000:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.199858 kernel: pci 0000:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.199920 kernel: pci 0000:00:02.0: supports D1 D2 May 27 03:51:14.199978 kernel: pci 0000:00:02.0: PME# supported from D0 D1 D3hot May 27 03:51:14.200044 kernel: pci 0000:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.200104 kernel: pci 0000:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.200163 kernel: pci 0000:00:03.0: supports D1 D2 May 27 03:51:14.200226 kernel: pci 0000:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.200293 kernel: pci 0000:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.200355 kernel: pci 0000:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.200415 kernel: pci 0000:00:04.0: supports D1 D2 May 27 03:51:14.200474 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D3hot May 27 03:51:14.200483 kernel: acpiphp: Slot [1-1] registered May 27 03:51:14.200490 kernel: acpiphp: Slot 
[2-1] registered May 27 03:51:14.200498 kernel: acpiphp: Slot [3-1] registered May 27 03:51:14.200505 kernel: acpiphp: Slot [4-1] registered May 27 03:51:14.200556 kernel: pci_bus 0000:00: on NUMA node 0 May 27 03:51:14.200616 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.200678 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.200738 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.200797 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.200856 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.200915 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.200974 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.201036 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.201095 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.201155 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.201218 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.201280 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.201340 kernel: pci 0000:00:01.0: bridge window [mem 0x70000000-0x701fffff]: assigned May 27 03:51:14.201399 kernel: pci 0000:00:01.0: bridge window [mem 0x3c0000000000-0x3c00001fffff 64bit pref]: assigned May 27 03:51:14.201460 kernel: pci 0000:00:02.0: bridge window [mem 0x70200000-0x703fffff]: assigned May 27 03:51:14.201519 kernel: pci 0000:00:02.0: bridge window [mem 0x3c0000200000-0x3c00003fffff 64bit pref]: assigned May 27 03:51:14.201578 kernel: pci 0000:00:03.0: bridge window [mem 0x70400000-0x705fffff]: assigned May 27 03:51:14.201637 kernel: pci 0000:00:03.0: bridge window [mem 0x3c0000400000-0x3c00005fffff 64bit pref]: assigned May 27 03:51:14.201696 kernel: pci 0000:00:04.0: bridge window [mem 0x70600000-0x707fffff]: assigned May 27 03:51:14.201755 kernel: pci 0000:00:04.0: bridge window [mem 0x3c0000600000-0x3c00007fffff 64bit pref]: assigned May 27 03:51:14.201814 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.201873 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.201935 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.201994 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202053 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.202113 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202171 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.202234 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202294 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; 
no space May 27 03:51:14.202352 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202414 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.202473 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202532 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.202591 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202650 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.202711 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.202769 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.202829 kernel: pci 0000:00:01.0: bridge window [mem 0x70000000-0x701fffff] May 27 03:51:14.202888 kernel: pci 0000:00:01.0: bridge window [mem 0x3c0000000000-0x3c00001fffff 64bit pref] May 27 03:51:14.202948 kernel: pci 0000:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.203007 kernel: pci 0000:00:02.0: bridge window [mem 0x70200000-0x703fffff] May 27 03:51:14.203065 kernel: pci 0000:00:02.0: bridge window [mem 0x3c0000200000-0x3c00003fffff 64bit pref] May 27 03:51:14.203127 kernel: pci 0000:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.203186 kernel: pci 0000:00:03.0: bridge window [mem 0x70400000-0x705fffff] May 27 03:51:14.203248 kernel: pci 0000:00:03.0: bridge window [mem 0x3c0000400000-0x3c00005fffff 64bit pref] May 27 03:51:14.203308 kernel: pci 0000:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.203367 kernel: pci 0000:00:04.0: bridge window [mem 0x70600000-0x707fffff] May 27 03:51:14.203426 kernel: pci 0000:00:04.0: bridge window [mem 0x3c0000600000-0x3c00007fffff 64bit pref] May 27 03:51:14.203482 kernel: pci_bus 0000:00: resource 4 [mem 0x70000000-0x7fffffff window] May 27 03:51:14.203534 kernel: pci_bus 0000:00: resource 5 [mem 0x3c0000000000-0x3fffdfffffff window] May 27 03:51:14.203597 kernel: pci_bus 0000:01: resource 1 [mem 0x70000000-0x701fffff] May 27 03:51:14.203653 kernel: pci_bus 0000:01: resource 2 [mem 0x3c0000000000-0x3c00001fffff 64bit pref] May 27 03:51:14.203715 kernel: pci_bus 0000:02: resource 1 [mem 0x70200000-0x703fffff] May 27 03:51:14.203771 kernel: pci_bus 0000:02: resource 2 [mem 0x3c0000200000-0x3c00003fffff 64bit pref] May 27 03:51:14.203841 kernel: pci_bus 0000:03: resource 1 [mem 0x70400000-0x705fffff] May 27 03:51:14.203898 kernel: pci_bus 0000:03: resource 2 [mem 0x3c0000400000-0x3c00005fffff 64bit pref] May 27 03:51:14.203959 kernel: pci_bus 0000:04: resource 1 [mem 0x70600000-0x707fffff] May 27 03:51:14.204015 kernel: pci_bus 0000:04: resource 2 [mem 0x3c0000600000-0x3c00007fffff 64bit pref] May 27 03:51:14.204024 kernel: ACPI: PCI Root Bridge [PCI7] (domain 0005 [bus 00-ff]) May 27 03:51:14.204088 kernel: acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.204146 kernel: acpi PNP0A08:02: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.204208 kernel: acpi PNP0A08:02: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.204266 kernel: acpi PNP0A08:02: MCFG quirk: ECAM at [mem 0x2ffff0000000-0x2fffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.204323 kernel: acpi PNP0A08:02: ECAM area [mem 0x2ffff0000000-0x2fffffffffff] reserved by PNP0C02:00 May 27 03:51:14.204379 kernel: acpi PNP0A08:02: ECAM at [mem 0x2ffff0000000-0x2fffffffffff] for [bus 00-ff] May 27 
03:51:14.204389 kernel: PCI host bridge to bus 0005:00 May 27 03:51:14.204448 kernel: pci_bus 0005:00: root bus resource [mem 0x30000000-0x3fffffff window] May 27 03:51:14.204501 kernel: pci_bus 0005:00: root bus resource [mem 0x2c0000000000-0x2fffdfffffff window] May 27 03:51:14.204556 kernel: pci_bus 0005:00: root bus resource [bus 00-ff] May 27 03:51:14.204621 kernel: pci 0005:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.204689 kernel: pci 0005:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.204750 kernel: pci 0005:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.204810 kernel: pci 0005:00:01.0: supports D1 D2 May 27 03:51:14.204868 kernel: pci 0005:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.204934 kernel: pci 0005:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.204996 kernel: pci 0005:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.205054 kernel: pci 0005:00:03.0: supports D1 D2 May 27 03:51:14.205113 kernel: pci 0005:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.205179 kernel: pci 0005:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.205244 kernel: pci 0005:00:05.0: PCI bridge to [bus 03] May 27 03:51:14.205305 kernel: pci 0005:00:05.0: bridge window [mem 0x30100000-0x301fffff] May 27 03:51:14.205363 kernel: pci 0005:00:05.0: supports D1 D2 May 27 03:51:14.205425 kernel: pci 0005:00:05.0: PME# supported from D0 D1 D3hot May 27 03:51:14.205490 kernel: pci 0005:00:07.0: [1def:e117] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.205550 kernel: pci 0005:00:07.0: PCI bridge to [bus 04] May 27 03:51:14.205609 kernel: pci 0005:00:07.0: bridge window [mem 0x30000000-0x300fffff] May 27 03:51:14.205668 kernel: pci 0005:00:07.0: supports D1 D2 May 27 03:51:14.205726 kernel: pci 0005:00:07.0: PME# supported from D0 D1 D3hot May 27 03:51:14.205736 kernel: acpiphp: Slot [1-2] registered May 27 03:51:14.205745 kernel: acpiphp: Slot [2-2] registered May 27 03:51:14.205811 kernel: pci 0005:03:00.0: [144d:a808] type 00 class 0x010802 PCIe Endpoint May 27 03:51:14.205876 kernel: pci 0005:03:00.0: BAR 0 [mem 0x30110000-0x30113fff 64bit] May 27 03:51:14.205938 kernel: pci 0005:03:00.0: ROM [mem 0x30100000-0x3010ffff pref] May 27 03:51:14.206005 kernel: pci 0005:04:00.0: [144d:a808] type 00 class 0x010802 PCIe Endpoint May 27 03:51:14.206068 kernel: pci 0005:04:00.0: BAR 0 [mem 0x30010000-0x30013fff 64bit] May 27 03:51:14.206133 kernel: pci 0005:04:00.0: ROM [mem 0x30000000-0x3000ffff pref] May 27 03:51:14.206190 kernel: pci_bus 0005:00: on NUMA node 0 May 27 03:51:14.206254 kernel: pci 0005:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.206318 kernel: pci 0005:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.206379 kernel: pci 0005:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.206440 kernel: pci 0005:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.206501 kernel: pci 0005:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.206560 kernel: pci 0005:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.206623 kernel: pci 0005:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.206684 kernel: 
pci 0005:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.206745 kernel: pci 0005:00:05.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 May 27 03:51:14.206806 kernel: pci 0005:00:07.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.206865 kernel: pci 0005:00:07.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.206924 kernel: pci 0005:00:07.0: bridge window [mem 0x00100000-0x001fffff] to [bus 04] add_size 100000 add_align 100000 May 27 03:51:14.206984 kernel: pci 0005:00:01.0: bridge window [mem 0x30000000-0x301fffff]: assigned May 27 03:51:14.207045 kernel: pci 0005:00:01.0: bridge window [mem 0x2c0000000000-0x2c00001fffff 64bit pref]: assigned May 27 03:51:14.207104 kernel: pci 0005:00:03.0: bridge window [mem 0x30200000-0x303fffff]: assigned May 27 03:51:14.207163 kernel: pci 0005:00:03.0: bridge window [mem 0x2c0000200000-0x2c00003fffff 64bit pref]: assigned May 27 03:51:14.207228 kernel: pci 0005:00:05.0: bridge window [mem 0x30400000-0x305fffff]: assigned May 27 03:51:14.207288 kernel: pci 0005:00:05.0: bridge window [mem 0x2c0000400000-0x2c00005fffff 64bit pref]: assigned May 27 03:51:14.207347 kernel: pci 0005:00:07.0: bridge window [mem 0x30600000-0x307fffff]: assigned May 27 03:51:14.207406 kernel: pci 0005:00:07.0: bridge window [mem 0x2c0000600000-0x2c00007fffff 64bit pref]: assigned May 27 03:51:14.207466 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.207527 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.207586 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.207645 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.207704 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.207764 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.207823 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.207882 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.207941 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.208022 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.208084 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.208144 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.208203 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.208274 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.208336 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.208398 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.208459 kernel: pci 0005:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.208519 kernel: pci 0005:00:01.0: bridge window [mem 0x30000000-0x301fffff] May 27 03:51:14.208578 kernel: pci 0005:00:01.0: bridge window [mem 0x2c0000000000-0x2c00001fffff 64bit pref] May 27 03:51:14.208637 kernel: pci 0005:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.208697 kernel: pci 0005:00:03.0: bridge window 
[mem 0x30200000-0x303fffff] May 27 03:51:14.208756 kernel: pci 0005:00:03.0: bridge window [mem 0x2c0000200000-0x2c00003fffff 64bit pref] May 27 03:51:14.208819 kernel: pci 0005:03:00.0: ROM [mem 0x30400000-0x3040ffff pref]: assigned May 27 03:51:14.208883 kernel: pci 0005:03:00.0: BAR 0 [mem 0x30410000-0x30413fff 64bit]: assigned May 27 03:51:14.208943 kernel: pci 0005:00:05.0: PCI bridge to [bus 03] May 27 03:51:14.209003 kernel: pci 0005:00:05.0: bridge window [mem 0x30400000-0x305fffff] May 27 03:51:14.209062 kernel: pci 0005:00:05.0: bridge window [mem 0x2c0000400000-0x2c00005fffff 64bit pref] May 27 03:51:14.209124 kernel: pci 0005:04:00.0: ROM [mem 0x30600000-0x3060ffff pref]: assigned May 27 03:51:14.209185 kernel: pci 0005:04:00.0: BAR 0 [mem 0x30610000-0x30613fff 64bit]: assigned May 27 03:51:14.209248 kernel: pci 0005:00:07.0: PCI bridge to [bus 04] May 27 03:51:14.209308 kernel: pci 0005:00:07.0: bridge window [mem 0x30600000-0x307fffff] May 27 03:51:14.209370 kernel: pci 0005:00:07.0: bridge window [mem 0x2c0000600000-0x2c00007fffff 64bit pref] May 27 03:51:14.209426 kernel: pci_bus 0005:00: resource 4 [mem 0x30000000-0x3fffffff window] May 27 03:51:14.209478 kernel: pci_bus 0005:00: resource 5 [mem 0x2c0000000000-0x2fffdfffffff window] May 27 03:51:14.209543 kernel: pci_bus 0005:01: resource 1 [mem 0x30000000-0x301fffff] May 27 03:51:14.209598 kernel: pci_bus 0005:01: resource 2 [mem 0x2c0000000000-0x2c00001fffff 64bit pref] May 27 03:51:14.209671 kernel: pci_bus 0005:02: resource 1 [mem 0x30200000-0x303fffff] May 27 03:51:14.209729 kernel: pci_bus 0005:02: resource 2 [mem 0x2c0000200000-0x2c00003fffff 64bit pref] May 27 03:51:14.209791 kernel: pci_bus 0005:03: resource 1 [mem 0x30400000-0x305fffff] May 27 03:51:14.209847 kernel: pci_bus 0005:03: resource 2 [mem 0x2c0000400000-0x2c00005fffff 64bit pref] May 27 03:51:14.209910 kernel: pci_bus 0005:04: resource 1 [mem 0x30600000-0x307fffff] May 27 03:51:14.209967 kernel: pci_bus 0005:04: resource 2 [mem 0x2c0000600000-0x2c00007fffff 64bit pref] May 27 03:51:14.209976 kernel: ACPI: PCI Root Bridge [PCI5] (domain 0003 [bus 00-ff]) May 27 03:51:14.210043 kernel: acpi PNP0A08:03: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.210102 kernel: acpi PNP0A08:03: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.210160 kernel: acpi PNP0A08:03: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.210220 kernel: acpi PNP0A08:03: MCFG quirk: ECAM at [mem 0x27fff0000000-0x27ffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.210279 kernel: acpi PNP0A08:03: ECAM area [mem 0x27fff0000000-0x27ffffffffff] reserved by PNP0C02:00 May 27 03:51:14.210336 kernel: acpi PNP0A08:03: ECAM at [mem 0x27fff0000000-0x27ffffffffff] for [bus 00-ff] May 27 03:51:14.210346 kernel: PCI host bridge to bus 0003:00 May 27 03:51:14.210408 kernel: pci_bus 0003:00: root bus resource [mem 0x10000000-0x1fffffff window] May 27 03:51:14.210462 kernel: pci_bus 0003:00: root bus resource [mem 0x240000000000-0x27ffdfffffff window] May 27 03:51:14.210514 kernel: pci_bus 0003:00: root bus resource [bus 00-ff] May 27 03:51:14.210582 kernel: pci 0003:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.210649 kernel: pci 0003:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.210709 kernel: pci 0003:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.210770 kernel: pci 0003:00:01.0: supports D1 D2 May 27 03:51:14.210829 kernel: pci 0003:00:01.0: 
PME# supported from D0 D1 D3hot May 27 03:51:14.210895 kernel: pci 0003:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.210955 kernel: pci 0003:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.211014 kernel: pci 0003:00:03.0: supports D1 D2 May 27 03:51:14.211072 kernel: pci 0003:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.211139 kernel: pci 0003:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.211200 kernel: pci 0003:00:05.0: PCI bridge to [bus 03-04] May 27 03:51:14.211265 kernel: pci 0003:00:05.0: bridge window [io 0x0000-0x0fff] May 27 03:51:14.211325 kernel: pci 0003:00:05.0: bridge window [mem 0x10000000-0x100fffff] May 27 03:51:14.211383 kernel: pci 0003:00:05.0: bridge window [mem 0x240000000000-0x2400000fffff 64bit pref] May 27 03:51:14.211443 kernel: pci 0003:00:05.0: supports D1 D2 May 27 03:51:14.211504 kernel: pci 0003:00:05.0: PME# supported from D0 D1 D3hot May 27 03:51:14.211513 kernel: acpiphp: Slot [1-3] registered May 27 03:51:14.211520 kernel: acpiphp: Slot [2-3] registered May 27 03:51:14.211591 kernel: pci 0003:03:00.0: [8086:1521] type 00 class 0x020000 PCIe Endpoint May 27 03:51:14.211653 kernel: pci 0003:03:00.0: BAR 0 [mem 0x10020000-0x1003ffff] May 27 03:51:14.211714 kernel: pci 0003:03:00.0: BAR 2 [io 0x0020-0x003f] May 27 03:51:14.211775 kernel: pci 0003:03:00.0: BAR 3 [mem 0x10044000-0x10047fff] May 27 03:51:14.211836 kernel: pci 0003:03:00.0: PME# supported from D0 D3hot D3cold May 27 03:51:14.211896 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000060000-0x240000063fff 64bit pref] May 27 03:51:14.211958 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000060000-0x24000007ffff 64bit pref]: contains BAR 0 for 8 VFs May 27 03:51:14.212020 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000040000-0x240000043fff 64bit pref] May 27 03:51:14.212081 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000040000-0x24000005ffff 64bit pref]: contains BAR 3 for 8 VFs May 27 03:51:14.212142 kernel: pci 0003:03:00.0: 8.000 Gb/s available PCIe bandwidth, limited by 5.0 GT/s PCIe x2 link at 0003:00:05.0 (capable of 16.000 Gb/s with 5.0 GT/s PCIe x4 link) May 27 03:51:14.212213 kernel: pci 0003:03:00.1: [8086:1521] type 00 class 0x020000 PCIe Endpoint May 27 03:51:14.212276 kernel: pci 0003:03:00.1: BAR 0 [mem 0x10000000-0x1001ffff] May 27 03:51:14.212337 kernel: pci 0003:03:00.1: BAR 2 [io 0x0000-0x001f] May 27 03:51:14.212397 kernel: pci 0003:03:00.1: BAR 3 [mem 0x10040000-0x10043fff] May 27 03:51:14.212458 kernel: pci 0003:03:00.1: PME# supported from D0 D3hot D3cold May 27 03:51:14.212521 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000020000-0x240000023fff 64bit pref] May 27 03:51:14.212582 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000020000-0x24000003ffff 64bit pref]: contains BAR 0 for 8 VFs May 27 03:51:14.212642 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000000000-0x240000003fff 64bit pref] May 27 03:51:14.212714 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000000000-0x24000001ffff 64bit pref]: contains BAR 3 for 8 VFs May 27 03:51:14.212770 kernel: pci_bus 0003:00: on NUMA node 0 May 27 03:51:14.212831 kernel: pci 0003:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.212891 kernel: pci 0003:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.212953 kernel: pci 0003:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.213014 kernel: pci 0003:00:03.0: 
bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.213073 kernel: pci 0003:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.213133 kernel: pci 0003:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.213193 kernel: pci 0003:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03-04] add_size 300000 add_align 100000 May 27 03:51:14.213256 kernel: pci 0003:00:05.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03-04] add_size 100000 add_align 100000 May 27 03:51:14.213316 kernel: pci 0003:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned May 27 03:51:14.213378 kernel: pci 0003:00:01.0: bridge window [mem 0x240000000000-0x2400001fffff 64bit pref]: assigned May 27 03:51:14.213438 kernel: pci 0003:00:03.0: bridge window [mem 0x10200000-0x103fffff]: assigned May 27 03:51:14.213497 kernel: pci 0003:00:03.0: bridge window [mem 0x240000200000-0x2400003fffff 64bit pref]: assigned May 27 03:51:14.213556 kernel: pci 0003:00:05.0: bridge window [mem 0x10400000-0x105fffff]: assigned May 27 03:51:14.213615 kernel: pci 0003:00:05.0: bridge window [mem 0x240000400000-0x2400006fffff 64bit pref]: assigned May 27 03:51:14.213675 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.213734 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.213794 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.213856 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.213916 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.213976 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.214037 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.214098 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.214157 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.214221 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.214282 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.214341 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.214400 kernel: pci 0003:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.214459 kernel: pci 0003:00:01.0: bridge window [mem 0x10000000-0x101fffff] May 27 03:51:14.214518 kernel: pci 0003:00:01.0: bridge window [mem 0x240000000000-0x2400001fffff 64bit pref] May 27 03:51:14.214578 kernel: pci 0003:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.214639 kernel: pci 0003:00:03.0: bridge window [mem 0x10200000-0x103fffff] May 27 03:51:14.214698 kernel: pci 0003:00:03.0: bridge window [mem 0x240000200000-0x2400003fffff 64bit pref] May 27 03:51:14.214760 kernel: pci 0003:03:00.0: BAR 0 [mem 0x10400000-0x1041ffff]: assigned May 27 03:51:14.214821 kernel: pci 0003:03:00.1: BAR 0 [mem 0x10420000-0x1043ffff]: assigned May 27 03:51:14.214883 kernel: pci 0003:03:00.0: BAR 3 [mem 0x10440000-0x10443fff]: assigned May 27 03:51:14.214944 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000400000-0x24000041ffff 64bit pref]: assigned May 27 03:51:14.215005 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000420000-0x24000043ffff 64bit pref]: 
assigned May 27 03:51:14.215066 kernel: pci 0003:03:00.1: BAR 3 [mem 0x10444000-0x10447fff]: assigned May 27 03:51:14.215129 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000440000-0x24000045ffff 64bit pref]: assigned May 27 03:51:14.215190 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000460000-0x24000047ffff 64bit pref]: assigned May 27 03:51:14.215258 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: can't assign; no space May 27 03:51:14.215321 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: failed to assign May 27 03:51:14.215383 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: can't assign; no space May 27 03:51:14.215444 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: failed to assign May 27 03:51:14.215506 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: can't assign; no space May 27 03:51:14.215567 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: failed to assign May 27 03:51:14.215627 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: can't assign; no space May 27 03:51:14.215689 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: failed to assign May 27 03:51:14.215748 kernel: pci 0003:00:05.0: PCI bridge to [bus 03-04] May 27 03:51:14.215808 kernel: pci 0003:00:05.0: bridge window [mem 0x10400000-0x105fffff] May 27 03:51:14.215867 kernel: pci 0003:00:05.0: bridge window [mem 0x240000400000-0x2400006fffff 64bit pref] May 27 03:51:14.215921 kernel: pci_bus 0003:00: Some PCI device resources are unassigned, try booting with pci=realloc May 27 03:51:14.215974 kernel: pci_bus 0003:00: resource 4 [mem 0x10000000-0x1fffffff window] May 27 03:51:14.216029 kernel: pci_bus 0003:00: resource 5 [mem 0x240000000000-0x27ffdfffffff window] May 27 03:51:14.216093 kernel: pci_bus 0003:01: resource 1 [mem 0x10000000-0x101fffff] May 27 03:51:14.216149 kernel: pci_bus 0003:01: resource 2 [mem 0x240000000000-0x2400001fffff 64bit pref] May 27 03:51:14.216223 kernel: pci_bus 0003:02: resource 1 [mem 0x10200000-0x103fffff] May 27 03:51:14.216280 kernel: pci_bus 0003:02: resource 2 [mem 0x240000200000-0x2400003fffff 64bit pref] May 27 03:51:14.216341 kernel: pci_bus 0003:03: resource 1 [mem 0x10400000-0x105fffff] May 27 03:51:14.216399 kernel: pci_bus 0003:03: resource 2 [mem 0x240000400000-0x2400006fffff 64bit pref] May 27 03:51:14.216409 kernel: ACPI: PCI Root Bridge [PCI0] (domain 000c [bus 00-ff]) May 27 03:51:14.216474 kernel: acpi PNP0A08:04: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.216532 kernel: acpi PNP0A08:04: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.216589 kernel: acpi PNP0A08:04: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.216648 kernel: acpi PNP0A08:04: MCFG quirk: ECAM at [mem 0x33fff0000000-0x33ffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.216705 kernel: acpi PNP0A08:04: ECAM area [mem 0x33fff0000000-0x33ffffffffff] reserved by PNP0C02:00 May 27 03:51:14.216764 kernel: acpi PNP0A08:04: ECAM at [mem 0x33fff0000000-0x33ffffffffff] for [bus 00-ff] May 27 03:51:14.216774 kernel: PCI host bridge to bus 000c:00 May 27 03:51:14.216834 kernel: pci_bus 000c:00: root bus resource [mem 0x40000000-0x4fffffff window] May 27 03:51:14.216888 kernel: pci_bus 000c:00: root bus resource [mem 0x300000000000-0x33ffdfffffff window] May 27 03:51:14.216941 kernel: pci_bus 000c:00: root bus resource [bus 00-ff] May 27 03:51:14.217007 kernel: pci 000c:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.217075 kernel: pci 000c:00:01.0: [1def:e101] type 01 class 0x060400 
PCIe Root Port May 27 03:51:14.217138 kernel: pci 000c:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.217198 kernel: pci 000c:00:01.0: enabling Extended Tags May 27 03:51:14.217262 kernel: pci 000c:00:01.0: supports D1 D2 May 27 03:51:14.217322 kernel: pci 000c:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.217388 kernel: pci 000c:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.217449 kernel: pci 000c:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.217509 kernel: pci 000c:00:02.0: supports D1 D2 May 27 03:51:14.217570 kernel: pci 000c:00:02.0: PME# supported from D0 D1 D3hot May 27 03:51:14.217637 kernel: pci 000c:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.217698 kernel: pci 000c:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.217758 kernel: pci 000c:00:03.0: supports D1 D2 May 27 03:51:14.217817 kernel: pci 000c:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.217884 kernel: pci 000c:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.217945 kernel: pci 000c:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.218007 kernel: pci 000c:00:04.0: supports D1 D2 May 27 03:51:14.218067 kernel: pci 000c:00:04.0: PME# supported from D0 D1 D3hot May 27 03:51:14.218077 kernel: acpiphp: Slot [1-4] registered May 27 03:51:14.218084 kernel: acpiphp: Slot [2-4] registered May 27 03:51:14.218092 kernel: acpiphp: Slot [3-2] registered May 27 03:51:14.218100 kernel: acpiphp: Slot [4-2] registered May 27 03:51:14.218152 kernel: pci_bus 000c:00: on NUMA node 0 May 27 03:51:14.218215 kernel: pci 000c:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.218278 kernel: pci 000c:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.218338 kernel: pci 000c:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.218398 kernel: pci 000c:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.218458 kernel: pci 000c:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.218519 kernel: pci 000c:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.218580 kernel: pci 000c:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.218639 kernel: pci 000c:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.218702 kernel: pci 000c:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.218763 kernel: pci 000c:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.218822 kernel: pci 000c:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.218882 kernel: pci 000c:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.218941 kernel: pci 000c:00:01.0: bridge window [mem 0x40000000-0x401fffff]: assigned May 27 03:51:14.219002 kernel: pci 000c:00:01.0: bridge window [mem 0x300000000000-0x3000001fffff 64bit pref]: assigned May 27 03:51:14.219061 kernel: pci 000c:00:02.0: bridge window [mem 0x40200000-0x403fffff]: assigned May 27 03:51:14.219123 kernel: pci 000c:00:02.0: bridge window [mem 0x300000200000-0x3000003fffff 64bit 
pref]: assigned May 27 03:51:14.219182 kernel: pci 000c:00:03.0: bridge window [mem 0x40400000-0x405fffff]: assigned May 27 03:51:14.219246 kernel: pci 000c:00:03.0: bridge window [mem 0x300000400000-0x3000005fffff 64bit pref]: assigned May 27 03:51:14.219306 kernel: pci 000c:00:04.0: bridge window [mem 0x40600000-0x407fffff]: assigned May 27 03:51:14.219366 kernel: pci 000c:00:04.0: bridge window [mem 0x300000600000-0x3000007fffff 64bit pref]: assigned May 27 03:51:14.219426 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.219486 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.219545 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.219607 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.219666 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.219726 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.219785 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.219844 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.219905 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.219964 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.220024 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.220084 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.220145 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.220207 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.220267 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.220327 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.220387 kernel: pci 000c:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.220447 kernel: pci 000c:00:01.0: bridge window [mem 0x40000000-0x401fffff] May 27 03:51:14.220508 kernel: pci 000c:00:01.0: bridge window [mem 0x300000000000-0x3000001fffff 64bit pref] May 27 03:51:14.220570 kernel: pci 000c:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.220632 kernel: pci 000c:00:02.0: bridge window [mem 0x40200000-0x403fffff] May 27 03:51:14.220692 kernel: pci 000c:00:02.0: bridge window [mem 0x300000200000-0x3000003fffff 64bit pref] May 27 03:51:14.220751 kernel: pci 000c:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.220813 kernel: pci 000c:00:03.0: bridge window [mem 0x40400000-0x405fffff] May 27 03:51:14.220872 kernel: pci 000c:00:03.0: bridge window [mem 0x300000400000-0x3000005fffff 64bit pref] May 27 03:51:14.220932 kernel: pci 000c:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.220992 kernel: pci 000c:00:04.0: bridge window [mem 0x40600000-0x407fffff] May 27 03:51:14.221052 kernel: pci 000c:00:04.0: bridge window [mem 0x300000600000-0x3000007fffff 64bit pref] May 27 03:51:14.221106 kernel: pci_bus 000c:00: resource 4 [mem 0x40000000-0x4fffffff window] May 27 03:51:14.221159 kernel: pci_bus 000c:00: resource 5 [mem 0x300000000000-0x33ffdfffffff window] May 27 03:51:14.221230 kernel: pci_bus 000c:01: resource 1 [mem 0x40000000-0x401fffff] May 27 03:51:14.221287 kernel: pci_bus 000c:01: resource 2 [mem 0x300000000000-0x3000001fffff 64bit pref] 
May 27 03:51:14.221350 kernel: pci_bus 000c:02: resource 1 [mem 0x40200000-0x403fffff] May 27 03:51:14.221406 kernel: pci_bus 000c:02: resource 2 [mem 0x300000200000-0x3000003fffff 64bit pref] May 27 03:51:14.221476 kernel: pci_bus 000c:03: resource 1 [mem 0x40400000-0x405fffff] May 27 03:51:14.221532 kernel: pci_bus 000c:03: resource 2 [mem 0x300000400000-0x3000005fffff 64bit pref] May 27 03:51:14.221596 kernel: pci_bus 000c:04: resource 1 [mem 0x40600000-0x407fffff] May 27 03:51:14.221652 kernel: pci_bus 000c:04: resource 2 [mem 0x300000600000-0x3000007fffff 64bit pref] May 27 03:51:14.221662 kernel: ACPI: PCI Root Bridge [PCI4] (domain 0002 [bus 00-ff]) May 27 03:51:14.221728 kernel: acpi PNP0A08:05: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.221788 kernel: acpi PNP0A08:05: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.221845 kernel: acpi PNP0A08:05: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.221902 kernel: acpi PNP0A08:05: MCFG quirk: ECAM at [mem 0x23fff0000000-0x23ffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.221961 kernel: acpi PNP0A08:05: ECAM area [mem 0x23fff0000000-0x23ffffffffff] reserved by PNP0C02:00 May 27 03:51:14.222018 kernel: acpi PNP0A08:05: ECAM at [mem 0x23fff0000000-0x23ffffffffff] for [bus 00-ff] May 27 03:51:14.222028 kernel: PCI host bridge to bus 0002:00 May 27 03:51:14.222087 kernel: pci_bus 0002:00: root bus resource [mem 0x00800000-0x0fffffff window] May 27 03:51:14.222140 kernel: pci_bus 0002:00: root bus resource [mem 0x200000000000-0x23ffdfffffff window] May 27 03:51:14.222193 kernel: pci_bus 0002:00: root bus resource [bus 00-ff] May 27 03:51:14.222265 kernel: pci 0002:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.222335 kernel: pci 0002:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.222397 kernel: pci 0002:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.222457 kernel: pci 0002:00:01.0: supports D1 D2 May 27 03:51:14.222516 kernel: pci 0002:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.222582 kernel: pci 0002:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.222643 kernel: pci 0002:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.222703 kernel: pci 0002:00:03.0: supports D1 D2 May 27 03:51:14.222764 kernel: pci 0002:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.222831 kernel: pci 0002:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.222892 kernel: pci 0002:00:05.0: PCI bridge to [bus 03] May 27 03:51:14.222951 kernel: pci 0002:00:05.0: supports D1 D2 May 27 03:51:14.223011 kernel: pci 0002:00:05.0: PME# supported from D0 D1 D3hot May 27 03:51:14.223077 kernel: pci 0002:00:07.0: [1def:e117] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.223137 kernel: pci 0002:00:07.0: PCI bridge to [bus 04] May 27 03:51:14.223198 kernel: pci 0002:00:07.0: supports D1 D2 May 27 03:51:14.223286 kernel: pci 0002:00:07.0: PME# supported from D0 D1 D3hot May 27 03:51:14.223296 kernel: acpiphp: Slot [1-5] registered May 27 03:51:14.223304 kernel: acpiphp: Slot [2-5] registered May 27 03:51:14.223312 kernel: acpiphp: Slot [3-3] registered May 27 03:51:14.223320 kernel: acpiphp: Slot [4-3] registered May 27 03:51:14.223373 kernel: pci_bus 0002:00: on NUMA node 0 May 27 03:51:14.223434 kernel: pci 0002:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.223496 kernel: pci 0002:00:01.0: bridge window [mem 
0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.223556 kernel: pci 0002:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 03:51:14.223617 kernel: pci 0002:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.223677 kernel: pci 0002:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.223737 kernel: pci 0002:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.223797 kernel: pci 0002:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.223857 kernel: pci 0002:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.223919 kernel: pci 0002:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.223979 kernel: pci 0002:00:07.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.224038 kernel: pci 0002:00:07.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.224098 kernel: pci 0002:00:07.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.224158 kernel: pci 0002:00:01.0: bridge window [mem 0x00800000-0x009fffff]: assigned May 27 03:51:14.224221 kernel: pci 0002:00:01.0: bridge window [mem 0x200000000000-0x2000001fffff 64bit pref]: assigned May 27 03:51:14.224282 kernel: pci 0002:00:03.0: bridge window [mem 0x00a00000-0x00bfffff]: assigned May 27 03:51:14.224344 kernel: pci 0002:00:03.0: bridge window [mem 0x200000200000-0x2000003fffff 64bit pref]: assigned May 27 03:51:14.224405 kernel: pci 0002:00:05.0: bridge window [mem 0x00c00000-0x00dfffff]: assigned May 27 03:51:14.224465 kernel: pci 0002:00:05.0: bridge window [mem 0x200000400000-0x2000005fffff 64bit pref]: assigned May 27 03:51:14.224525 kernel: pci 0002:00:07.0: bridge window [mem 0x00e00000-0x00ffffff]: assigned May 27 03:51:14.224585 kernel: pci 0002:00:07.0: bridge window [mem 0x200000600000-0x2000007fffff 64bit pref]: assigned May 27 03:51:14.224644 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.224704 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.224763 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.224825 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.224885 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.224944 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225003 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.225063 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225122 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.225181 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225243 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.225305 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225364 kernel: pci 0002:00:03.0: bridge window [io 
size 0x1000]: can't assign; no space May 27 03:51:14.225424 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225483 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.225543 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.225602 kernel: pci 0002:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.225661 kernel: pci 0002:00:01.0: bridge window [mem 0x00800000-0x009fffff] May 27 03:51:14.225721 kernel: pci 0002:00:01.0: bridge window [mem 0x200000000000-0x2000001fffff 64bit pref] May 27 03:51:14.225782 kernel: pci 0002:00:03.0: PCI bridge to [bus 02] May 27 03:51:14.225842 kernel: pci 0002:00:03.0: bridge window [mem 0x00a00000-0x00bfffff] May 27 03:51:14.225901 kernel: pci 0002:00:03.0: bridge window [mem 0x200000200000-0x2000003fffff 64bit pref] May 27 03:51:14.225960 kernel: pci 0002:00:05.0: PCI bridge to [bus 03] May 27 03:51:14.226020 kernel: pci 0002:00:05.0: bridge window [mem 0x00c00000-0x00dfffff] May 27 03:51:14.226080 kernel: pci 0002:00:05.0: bridge window [mem 0x200000400000-0x2000005fffff 64bit pref] May 27 03:51:14.226139 kernel: pci 0002:00:07.0: PCI bridge to [bus 04] May 27 03:51:14.226201 kernel: pci 0002:00:07.0: bridge window [mem 0x00e00000-0x00ffffff] May 27 03:51:14.226270 kernel: pci 0002:00:07.0: bridge window [mem 0x200000600000-0x2000007fffff 64bit pref] May 27 03:51:14.226326 kernel: pci_bus 0002:00: resource 4 [mem 0x00800000-0x0fffffff window] May 27 03:51:14.226379 kernel: pci_bus 0002:00: resource 5 [mem 0x200000000000-0x23ffdfffffff window] May 27 03:51:14.226442 kernel: pci_bus 0002:01: resource 1 [mem 0x00800000-0x009fffff] May 27 03:51:14.226498 kernel: pci_bus 0002:01: resource 2 [mem 0x200000000000-0x2000001fffff 64bit pref] May 27 03:51:14.226564 kernel: pci_bus 0002:02: resource 1 [mem 0x00a00000-0x00bfffff] May 27 03:51:14.226620 kernel: pci_bus 0002:02: resource 2 [mem 0x200000200000-0x2000003fffff 64bit pref] May 27 03:51:14.226682 kernel: pci_bus 0002:03: resource 1 [mem 0x00c00000-0x00dfffff] May 27 03:51:14.226737 kernel: pci_bus 0002:03: resource 2 [mem 0x200000400000-0x2000005fffff 64bit pref] May 27 03:51:14.226806 kernel: pci_bus 0002:04: resource 1 [mem 0x00e00000-0x00ffffff] May 27 03:51:14.226862 kernel: pci_bus 0002:04: resource 2 [mem 0x200000600000-0x2000007fffff 64bit pref] May 27 03:51:14.226873 kernel: ACPI: PCI Root Bridge [PCI2] (domain 0001 [bus 00-ff]) May 27 03:51:14.226938 kernel: acpi PNP0A08:06: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.226997 kernel: acpi PNP0A08:06: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.227054 kernel: acpi PNP0A08:06: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.227111 kernel: acpi PNP0A08:06: MCFG quirk: ECAM at [mem 0x3bfff0000000-0x3bffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.227168 kernel: acpi PNP0A08:06: ECAM area [mem 0x3bfff0000000-0x3bffffffffff] reserved by PNP0C02:00 May 27 03:51:14.227233 kernel: acpi PNP0A08:06: ECAM at [mem 0x3bfff0000000-0x3bffffffffff] for [bus 00-ff] May 27 03:51:14.227243 kernel: PCI host bridge to bus 0001:00 May 27 03:51:14.227303 kernel: pci_bus 0001:00: root bus resource [mem 0x60000000-0x6fffffff window] May 27 03:51:14.227356 kernel: pci_bus 0001:00: root bus resource [mem 0x380000000000-0x3bffdfffffff window] May 27 03:51:14.227409 kernel: pci_bus 0001:00: root bus resource [bus 00-ff] May 27 03:51:14.227475 kernel: pci 
0001:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.227542 kernel: pci 0001:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.227606 kernel: pci 0001:00:01.0: PCI bridge to [bus 01] May 27 03:51:14.227665 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff] May 27 03:51:14.227725 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref] May 27 03:51:14.227784 kernel: pci 0001:00:01.0: enabling Extended Tags May 27 03:51:14.227844 kernel: pci 0001:00:01.0: supports D1 D2 May 27 03:51:14.227904 kernel: pci 0001:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.227970 kernel: pci 0001:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.228032 kernel: pci 0001:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.228091 kernel: pci 0001:00:02.0: supports D1 D2 May 27 03:51:14.228151 kernel: pci 0001:00:02.0: PME# supported from D0 D1 D3hot May 27 03:51:14.228221 kernel: pci 0001:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.228283 kernel: pci 0001:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.228343 kernel: pci 0001:00:03.0: supports D1 D2 May 27 03:51:14.228403 kernel: pci 0001:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.228470 kernel: pci 0001:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.228531 kernel: pci 0001:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.228591 kernel: pci 0001:00:04.0: supports D1 D2 May 27 03:51:14.228651 kernel: pci 0001:00:04.0: PME# supported from D0 D1 D3hot May 27 03:51:14.228660 kernel: acpiphp: Slot [1-6] registered May 27 03:51:14.228727 kernel: pci 0001:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint May 27 03:51:14.228790 kernel: pci 0001:01:00.0: BAR 0 [mem 0x380002000000-0x380003ffffff 64bit pref] May 27 03:51:14.228854 kernel: pci 0001:01:00.0: ROM [mem 0x60100000-0x601fffff pref] May 27 03:51:14.228915 kernel: pci 0001:01:00.0: PME# supported from D3cold May 27 03:51:14.228977 kernel: pci 0001:01:00.0: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 03:51:14.229046 kernel: pci 0001:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint May 27 03:51:14.229108 kernel: pci 0001:01:00.1: BAR 0 [mem 0x380000000000-0x380001ffffff 64bit pref] May 27 03:51:14.229169 kernel: pci 0001:01:00.1: ROM [mem 0x60000000-0x600fffff pref] May 27 03:51:14.229234 kernel: pci 0001:01:00.1: PME# supported from D3cold May 27 03:51:14.229246 kernel: acpiphp: Slot [2-6] registered May 27 03:51:14.229254 kernel: acpiphp: Slot [3-4] registered May 27 03:51:14.229262 kernel: acpiphp: Slot [4-4] registered May 27 03:51:14.229314 kernel: pci_bus 0001:00: on NUMA node 0 May 27 03:51:14.229374 kernel: pci 0001:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 03:51:14.229437 kernel: pci 0001:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 03:51:14.229497 kernel: pci 0001:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.229558 kernel: pci 0001:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 03:51:14.229621 kernel: pci 0001:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.229681 kernel: pci 0001:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 
03] add_size 200000 add_align 100000 May 27 03:51:14.229741 kernel: pci 0001:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.229801 kernel: pci 0001:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.229860 kernel: pci 0001:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.229921 kernel: pci 0001:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.229981 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref]: assigned May 27 03:51:14.230042 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff]: assigned May 27 03:51:14.230102 kernel: pci 0001:00:02.0: bridge window [mem 0x60200000-0x603fffff]: assigned May 27 03:51:14.230161 kernel: pci 0001:00:02.0: bridge window [mem 0x380004000000-0x3800041fffff 64bit pref]: assigned May 27 03:51:14.230224 kernel: pci 0001:00:03.0: bridge window [mem 0x60400000-0x605fffff]: assigned May 27 03:51:14.230283 kernel: pci 0001:00:03.0: bridge window [mem 0x380004200000-0x3800043fffff 64bit pref]: assigned May 27 03:51:14.230343 kernel: pci 0001:00:04.0: bridge window [mem 0x60600000-0x607fffff]: assigned May 27 03:51:14.230402 kernel: pci 0001:00:04.0: bridge window [mem 0x380004400000-0x3800045fffff 64bit pref]: assigned May 27 03:51:14.230462 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.230523 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.230583 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.230644 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.230703 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.230763 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.230823 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.230883 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.230943 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.231003 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.231062 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.231122 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.231181 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.231243 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.231303 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.231363 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.231426 kernel: pci 0001:01:00.0: BAR 0 [mem 0x380000000000-0x380001ffffff 64bit pref]: assigned May 27 03:51:14.231490 kernel: pci 0001:01:00.1: BAR 0 [mem 0x380002000000-0x380003ffffff 64bit pref]: assigned May 27 03:51:14.231551 kernel: pci 0001:01:00.0: ROM [mem 0x60000000-0x600fffff pref]: assigned May 27 03:51:14.231613 kernel: pci 0001:01:00.1: ROM [mem 0x60100000-0x601fffff pref]: assigned May 27 03:51:14.231673 kernel: pci 0001:00:01.0: PCI bridge to [bus 01] 
May 27 03:51:14.231733 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff] May 27 03:51:14.231792 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref] May 27 03:51:14.231852 kernel: pci 0001:00:02.0: PCI bridge to [bus 02] May 27 03:51:14.231913 kernel: pci 0001:00:02.0: bridge window [mem 0x60200000-0x603fffff] May 27 03:51:14.231973 kernel: pci 0001:00:02.0: bridge window [mem 0x380004000000-0x3800041fffff 64bit pref] May 27 03:51:14.232033 kernel: pci 0001:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.232094 kernel: pci 0001:00:03.0: bridge window [mem 0x60400000-0x605fffff] May 27 03:51:14.232153 kernel: pci 0001:00:03.0: bridge window [mem 0x380004200000-0x3800043fffff 64bit pref] May 27 03:51:14.232216 kernel: pci 0001:00:04.0: PCI bridge to [bus 04] May 27 03:51:14.232277 kernel: pci 0001:00:04.0: bridge window [mem 0x60600000-0x607fffff] May 27 03:51:14.232338 kernel: pci 0001:00:04.0: bridge window [mem 0x380004400000-0x3800045fffff 64bit pref] May 27 03:51:14.232393 kernel: pci_bus 0001:00: resource 4 [mem 0x60000000-0x6fffffff window] May 27 03:51:14.232446 kernel: pci_bus 0001:00: resource 5 [mem 0x380000000000-0x3bffdfffffff window] May 27 03:51:14.232511 kernel: pci_bus 0001:01: resource 1 [mem 0x60000000-0x601fffff] May 27 03:51:14.232567 kernel: pci_bus 0001:01: resource 2 [mem 0x380000000000-0x380003ffffff 64bit pref] May 27 03:51:14.232637 kernel: pci_bus 0001:02: resource 1 [mem 0x60200000-0x603fffff] May 27 03:51:14.232696 kernel: pci_bus 0001:02: resource 2 [mem 0x380004000000-0x3800041fffff 64bit pref] May 27 03:51:14.232757 kernel: pci_bus 0001:03: resource 1 [mem 0x60400000-0x605fffff] May 27 03:51:14.232813 kernel: pci_bus 0001:03: resource 2 [mem 0x380004200000-0x3800043fffff 64bit pref] May 27 03:51:14.232874 kernel: pci_bus 0001:04: resource 1 [mem 0x60600000-0x607fffff] May 27 03:51:14.232930 kernel: pci_bus 0001:04: resource 2 [mem 0x380004400000-0x3800045fffff 64bit pref] May 27 03:51:14.232940 kernel: ACPI: PCI Root Bridge [PCI6] (domain 0004 [bus 00-ff]) May 27 03:51:14.233003 kernel: acpi PNP0A08:07: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 03:51:14.233063 kernel: acpi PNP0A08:07: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 03:51:14.233120 kernel: acpi PNP0A08:07: _OSC: OS now controls [AER PCIeCapability] May 27 03:51:14.233178 kernel: acpi PNP0A08:07: MCFG quirk: ECAM at [mem 0x2bfff0000000-0x2bffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 03:51:14.233240 kernel: acpi PNP0A08:07: ECAM area [mem 0x2bfff0000000-0x2bffffffffff] reserved by PNP0C02:00 May 27 03:51:14.233298 kernel: acpi PNP0A08:07: ECAM at [mem 0x2bfff0000000-0x2bffffffffff] for [bus 00-ff] May 27 03:51:14.233308 kernel: PCI host bridge to bus 0004:00 May 27 03:51:14.233368 kernel: pci_bus 0004:00: root bus resource [mem 0x20000000-0x2fffffff window] May 27 03:51:14.233423 kernel: pci_bus 0004:00: root bus resource [mem 0x280000000000-0x2bffdfffffff window] May 27 03:51:14.233477 kernel: pci_bus 0004:00: root bus resource [bus 00-ff] May 27 03:51:14.233545 kernel: pci 0004:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 03:51:14.233612 kernel: pci 0004:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.233673 kernel: pci 0004:00:01.0: PCI bridge to [bus 01-02] May 27 03:51:14.233733 kernel: pci 0004:00:01.0: bridge window [io 0x0000-0x0fff] May 27 03:51:14.233794 kernel: pci 0004:00:01.0: bridge window [mem 
0x20000000-0x220fffff] May 27 03:51:14.233853 kernel: pci 0004:00:01.0: supports D1 D2 May 27 03:51:14.233913 kernel: pci 0004:00:01.0: PME# supported from D0 D1 D3hot May 27 03:51:14.233979 kernel: pci 0004:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.234039 kernel: pci 0004:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.234099 kernel: pci 0004:00:03.0: bridge window [mem 0x22200000-0x222fffff] May 27 03:51:14.234158 kernel: pci 0004:00:03.0: supports D1 D2 May 27 03:51:14.234224 kernel: pci 0004:00:03.0: PME# supported from D0 D1 D3hot May 27 03:51:14.234293 kernel: pci 0004:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port May 27 03:51:14.234354 kernel: pci 0004:00:05.0: PCI bridge to [bus 04] May 27 03:51:14.234414 kernel: pci 0004:00:05.0: supports D1 D2 May 27 03:51:14.234473 kernel: pci 0004:00:05.0: PME# supported from D0 D1 D3hot May 27 03:51:14.234540 kernel: pci 0004:01:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge May 27 03:51:14.234602 kernel: pci 0004:01:00.0: PCI bridge to [bus 02] May 27 03:51:14.234665 kernel: pci 0004:01:00.0: bridge window [io 0x0000-0x0fff] May 27 03:51:14.234728 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x220fffff] May 27 03:51:14.234789 kernel: pci 0004:01:00.0: enabling Extended Tags May 27 03:51:14.234850 kernel: pci 0004:01:00.0: supports D1 D2 May 27 03:51:14.234911 kernel: pci 0004:01:00.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 03:51:14.234979 kernel: pci_bus 0004:02: extended config space not accessible May 27 03:51:14.235050 kernel: pci 0004:02:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint May 27 03:51:14.235115 kernel: pci 0004:02:00.0: BAR 0 [mem 0x20000000-0x21ffffff] May 27 03:51:14.235180 kernel: pci 0004:02:00.0: BAR 1 [mem 0x22000000-0x2201ffff] May 27 03:51:14.235249 kernel: pci 0004:02:00.0: BAR 2 [io 0x0000-0x007f] May 27 03:51:14.235312 kernel: pci 0004:02:00.0: supports D1 D2 May 27 03:51:14.235376 kernel: pci 0004:02:00.0: PME# supported from D0 D1 D2 D3hot D3cold May 27 03:51:14.235454 kernel: pci 0004:03:00.0: [1912:0014] type 00 class 0x0c0330 PCIe Endpoint May 27 03:51:14.235516 kernel: pci 0004:03:00.0: BAR 0 [mem 0x22200000-0x22201fff 64bit] May 27 03:51:14.235578 kernel: pci 0004:03:00.0: PME# supported from D0 D3hot D3cold May 27 03:51:14.235634 kernel: pci_bus 0004:00: on NUMA node 0 May 27 03:51:14.235695 kernel: pci 0004:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01-02] add_size 200000 add_align 100000 May 27 03:51:14.235756 kernel: pci 0004:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 03:51:14.235817 kernel: pci 0004:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 03:51:14.235876 kernel: pci 0004:00:03.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 May 27 03:51:14.235937 kernel: pci 0004:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 03:51:14.235996 kernel: pci 0004:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.236059 kernel: pci 0004:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 03:51:14.236118 kernel: pci 0004:00:01.0: bridge window [mem 0x20000000-0x22ffffff]: assigned May 27 03:51:14.236178 kernel: pci 0004:00:01.0: bridge window [mem 0x280000000000-0x2800001fffff 64bit pref]: assigned May 
27 03:51:14.236241 kernel: pci 0004:00:03.0: bridge window [mem 0x23000000-0x231fffff]: assigned May 27 03:51:14.236302 kernel: pci 0004:00:03.0: bridge window [mem 0x280000200000-0x2800003fffff 64bit pref]: assigned May 27 03:51:14.236362 kernel: pci 0004:00:05.0: bridge window [mem 0x23200000-0x233fffff]: assigned May 27 03:51:14.236421 kernel: pci 0004:00:05.0: bridge window [mem 0x280000400000-0x2800005fffff 64bit pref]: assigned May 27 03:51:14.236482 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.236543 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.236602 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.236662 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.236722 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.236781 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.236841 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.236902 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.236963 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.237023 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.237083 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.237142 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.237204 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x22ffffff]: assigned May 27 03:51:14.237271 kernel: pci 0004:01:00.0: bridge window [io size 0x1000]: can't assign; no space May 27 03:51:14.237333 kernel: pci 0004:01:00.0: bridge window [io size 0x1000]: failed to assign May 27 03:51:14.237397 kernel: pci 0004:02:00.0: BAR 0 [mem 0x20000000-0x21ffffff]: assigned May 27 03:51:14.237463 kernel: pci 0004:02:00.0: BAR 1 [mem 0x22000000-0x2201ffff]: assigned May 27 03:51:14.237526 kernel: pci 0004:02:00.0: BAR 2 [io size 0x0080]: can't assign; no space May 27 03:51:14.237590 kernel: pci 0004:02:00.0: BAR 2 [io size 0x0080]: failed to assign May 27 03:51:14.237651 kernel: pci 0004:01:00.0: PCI bridge to [bus 02] May 27 03:51:14.237712 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x22ffffff] May 27 03:51:14.237772 kernel: pci 0004:00:01.0: PCI bridge to [bus 01-02] May 27 03:51:14.237833 kernel: pci 0004:00:01.0: bridge window [mem 0x20000000-0x22ffffff] May 27 03:51:14.237895 kernel: pci 0004:00:01.0: bridge window [mem 0x280000000000-0x2800001fffff 64bit pref] May 27 03:51:14.237957 kernel: pci 0004:03:00.0: BAR 0 [mem 0x23000000-0x23001fff 64bit]: assigned May 27 03:51:14.238018 kernel: pci 0004:00:03.0: PCI bridge to [bus 03] May 27 03:51:14.238078 kernel: pci 0004:00:03.0: bridge window [mem 0x23000000-0x231fffff] May 27 03:51:14.238138 kernel: pci 0004:00:03.0: bridge window [mem 0x280000200000-0x2800003fffff 64bit pref] May 27 03:51:14.238198 kernel: pci 0004:00:05.0: PCI bridge to [bus 04] May 27 03:51:14.238262 kernel: pci 0004:00:05.0: bridge window [mem 0x23200000-0x233fffff] May 27 03:51:14.238321 kernel: pci 0004:00:05.0: bridge window [mem 0x280000400000-0x2800005fffff 64bit pref] May 27 03:51:14.238377 kernel: pci_bus 0004:00: Some PCI device resources are unassigned, try booting with pci=realloc May 27 03:51:14.238430 
kernel: pci_bus 0004:00: resource 4 [mem 0x20000000-0x2fffffff window] May 27 03:51:14.238484 kernel: pci_bus 0004:00: resource 5 [mem 0x280000000000-0x2bffdfffffff window] May 27 03:51:14.238547 kernel: pci_bus 0004:01: resource 1 [mem 0x20000000-0x22ffffff] May 27 03:51:14.238603 kernel: pci_bus 0004:01: resource 2 [mem 0x280000000000-0x2800001fffff 64bit pref] May 27 03:51:14.238662 kernel: pci_bus 0004:02: resource 1 [mem 0x20000000-0x22ffffff] May 27 03:51:14.238725 kernel: pci_bus 0004:03: resource 1 [mem 0x23000000-0x231fffff] May 27 03:51:14.238781 kernel: pci_bus 0004:03: resource 2 [mem 0x280000200000-0x2800003fffff 64bit pref] May 27 03:51:14.238845 kernel: pci_bus 0004:04: resource 1 [mem 0x23200000-0x233fffff] May 27 03:51:14.238901 kernel: pci_bus 0004:04: resource 2 [mem 0x280000400000-0x2800005fffff 64bit pref] May 27 03:51:14.238911 kernel: ACPI: CPU18 has been hot-added May 27 03:51:14.238918 kernel: ACPI: CPU58 has been hot-added May 27 03:51:14.238926 kernel: ACPI: CPU38 has been hot-added May 27 03:51:14.238934 kernel: ACPI: CPU78 has been hot-added May 27 03:51:14.238943 kernel: ACPI: CPU16 has been hot-added May 27 03:51:14.238950 kernel: ACPI: CPU56 has been hot-added May 27 03:51:14.238958 kernel: ACPI: CPU36 has been hot-added May 27 03:51:14.238966 kernel: ACPI: CPU76 has been hot-added May 27 03:51:14.238973 kernel: ACPI: CPU17 has been hot-added May 27 03:51:14.238981 kernel: ACPI: CPU57 has been hot-added May 27 03:51:14.238988 kernel: ACPI: CPU37 has been hot-added May 27 03:51:14.238996 kernel: ACPI: CPU77 has been hot-added May 27 03:51:14.239003 kernel: ACPI: CPU19 has been hot-added May 27 03:51:14.239012 kernel: ACPI: CPU59 has been hot-added May 27 03:51:14.239020 kernel: ACPI: CPU39 has been hot-added May 27 03:51:14.239027 kernel: ACPI: CPU79 has been hot-added May 27 03:51:14.239035 kernel: ACPI: CPU12 has been hot-added May 27 03:51:14.239042 kernel: ACPI: CPU52 has been hot-added May 27 03:51:14.239050 kernel: ACPI: CPU32 has been hot-added May 27 03:51:14.239058 kernel: ACPI: CPU72 has been hot-added May 27 03:51:14.239065 kernel: ACPI: CPU8 has been hot-added May 27 03:51:14.239073 kernel: ACPI: CPU48 has been hot-added May 27 03:51:14.239080 kernel: ACPI: CPU28 has been hot-added May 27 03:51:14.239089 kernel: ACPI: CPU68 has been hot-added May 27 03:51:14.239097 kernel: ACPI: CPU10 has been hot-added May 27 03:51:14.239104 kernel: ACPI: CPU50 has been hot-added May 27 03:51:14.239112 kernel: ACPI: CPU30 has been hot-added May 27 03:51:14.239119 kernel: ACPI: CPU70 has been hot-added May 27 03:51:14.239127 kernel: ACPI: CPU14 has been hot-added May 27 03:51:14.239134 kernel: ACPI: CPU54 has been hot-added May 27 03:51:14.239142 kernel: ACPI: CPU34 has been hot-added May 27 03:51:14.239149 kernel: ACPI: CPU74 has been hot-added May 27 03:51:14.239158 kernel: ACPI: CPU4 has been hot-added May 27 03:51:14.239165 kernel: ACPI: CPU44 has been hot-added May 27 03:51:14.239173 kernel: ACPI: CPU24 has been hot-added May 27 03:51:14.239180 kernel: ACPI: CPU64 has been hot-added May 27 03:51:14.239188 kernel: ACPI: CPU0 has been hot-added May 27 03:51:14.239196 kernel: ACPI: CPU40 has been hot-added May 27 03:51:14.239203 kernel: ACPI: CPU20 has been hot-added May 27 03:51:14.239214 kernel: ACPI: CPU60 has been hot-added May 27 03:51:14.239221 kernel: ACPI: CPU2 has been hot-added May 27 03:51:14.239229 kernel: ACPI: CPU42 has been hot-added May 27 03:51:14.239238 kernel: ACPI: CPU22 has been hot-added May 27 03:51:14.239246 kernel: ACPI: CPU62 has been 
hot-added May 27 03:51:14.239253 kernel: ACPI: CPU6 has been hot-added May 27 03:51:14.239262 kernel: ACPI: CPU46 has been hot-added May 27 03:51:14.239270 kernel: ACPI: CPU26 has been hot-added May 27 03:51:14.239277 kernel: ACPI: CPU66 has been hot-added May 27 03:51:14.239285 kernel: ACPI: CPU5 has been hot-added May 27 03:51:14.239292 kernel: ACPI: CPU45 has been hot-added May 27 03:51:14.239300 kernel: ACPI: CPU25 has been hot-added May 27 03:51:14.239309 kernel: ACPI: CPU65 has been hot-added May 27 03:51:14.239317 kernel: ACPI: CPU1 has been hot-added May 27 03:51:14.239324 kernel: ACPI: CPU41 has been hot-added May 27 03:51:14.239332 kernel: ACPI: CPU21 has been hot-added May 27 03:51:14.239339 kernel: ACPI: CPU61 has been hot-added May 27 03:51:14.239347 kernel: ACPI: CPU3 has been hot-added May 27 03:51:14.239354 kernel: ACPI: CPU43 has been hot-added May 27 03:51:14.239362 kernel: ACPI: CPU23 has been hot-added May 27 03:51:14.239370 kernel: ACPI: CPU63 has been hot-added May 27 03:51:14.239377 kernel: ACPI: CPU7 has been hot-added May 27 03:51:14.239386 kernel: ACPI: CPU47 has been hot-added May 27 03:51:14.239393 kernel: ACPI: CPU27 has been hot-added May 27 03:51:14.239401 kernel: ACPI: CPU67 has been hot-added May 27 03:51:14.239408 kernel: ACPI: CPU13 has been hot-added May 27 03:51:14.239416 kernel: ACPI: CPU53 has been hot-added May 27 03:51:14.239423 kernel: ACPI: CPU33 has been hot-added May 27 03:51:14.239431 kernel: ACPI: CPU73 has been hot-added May 27 03:51:14.239438 kernel: ACPI: CPU9 has been hot-added May 27 03:51:14.239446 kernel: ACPI: CPU49 has been hot-added May 27 03:51:14.239455 kernel: ACPI: CPU29 has been hot-added May 27 03:51:14.239462 kernel: ACPI: CPU69 has been hot-added May 27 03:51:14.239470 kernel: ACPI: CPU11 has been hot-added May 27 03:51:14.239477 kernel: ACPI: CPU51 has been hot-added May 27 03:51:14.239485 kernel: ACPI: CPU31 has been hot-added May 27 03:51:14.239492 kernel: ACPI: CPU71 has been hot-added May 27 03:51:14.239499 kernel: ACPI: CPU15 has been hot-added May 27 03:51:14.239507 kernel: ACPI: CPU55 has been hot-added May 27 03:51:14.239514 kernel: ACPI: CPU35 has been hot-added May 27 03:51:14.239522 kernel: ACPI: CPU75 has been hot-added May 27 03:51:14.239530 kernel: iommu: Default domain type: Translated May 27 03:51:14.239538 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 27 03:51:14.239546 kernel: efivars: Registered efivars operations May 27 03:51:14.239611 kernel: pci 0004:02:00.0: vgaarb: setting as boot VGA device May 27 03:51:14.239676 kernel: pci 0004:02:00.0: vgaarb: bridge control possible May 27 03:51:14.239739 kernel: pci 0004:02:00.0: vgaarb: VGA device added: decodes=io+mem,owns=none,locks=none May 27 03:51:14.239749 kernel: vgaarb: loaded May 27 03:51:14.239756 kernel: clocksource: Switched to clocksource arch_sys_counter May 27 03:51:14.239766 kernel: VFS: Disk quotas dquot_6.6.0 May 27 03:51:14.239774 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 03:51:14.239781 kernel: pnp: PnP ACPI init May 27 03:51:14.239846 kernel: system 00:00: [mem 0x3bfff0000000-0x3bffffffffff window] could not be reserved May 27 03:51:14.239903 kernel: system 00:00: [mem 0x3ffff0000000-0x3fffffffffff window] could not be reserved May 27 03:51:14.239958 kernel: system 00:00: [mem 0x23fff0000000-0x23ffffffffff window] could not be reserved May 27 03:51:14.240012 kernel: system 00:00: [mem 0x27fff0000000-0x27ffffffffff window] could not be reserved May 27 03:51:14.240065 kernel: system 
00:00: [mem 0x2bfff0000000-0x2bffffffffff window] could not be reserved May 27 03:51:14.240121 kernel: system 00:00: [mem 0x2ffff0000000-0x2fffffffffff window] could not be reserved May 27 03:51:14.240176 kernel: system 00:00: [mem 0x33fff0000000-0x33ffffffffff window] could not be reserved May 27 03:51:14.240234 kernel: system 00:00: [mem 0x37fff0000000-0x37ffffffffff window] could not be reserved May 27 03:51:14.240243 kernel: pnp: PnP ACPI: found 1 devices May 27 03:51:14.240251 kernel: NET: Registered PF_INET protocol family May 27 03:51:14.240259 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 03:51:14.240267 kernel: tcp_listen_portaddr_hash hash table entries: 65536 (order: 8, 1048576 bytes, linear) May 27 03:51:14.240277 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 03:51:14.240284 kernel: TCP established hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 27 03:51:14.240292 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 27 03:51:14.240301 kernel: TCP: Hash tables configured (established 524288 bind 65536) May 27 03:51:14.240309 kernel: UDP hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 27 03:51:14.240317 kernel: UDP-Lite hash table entries: 65536 (order: 9, 2097152 bytes, linear) May 27 03:51:14.240325 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 03:51:14.240387 kernel: pci 0001:01:00.0: CLS mismatch (64 != 32), using 64 bytes May 27 03:51:14.240398 kernel: kvm [1]: nv: 554 coarse grained trap handlers May 27 03:51:14.240407 kernel: kvm [1]: IPA Size Limit: 48 bits May 27 03:51:14.240414 kernel: kvm [1]: GICv3: no GICV resource entry May 27 03:51:14.240422 kernel: kvm [1]: disabling GICv2 emulation May 27 03:51:14.240430 kernel: kvm [1]: GIC system register CPU interface enabled May 27 03:51:14.240437 kernel: kvm [1]: vgic interrupt IRQ9 May 27 03:51:14.240445 kernel: kvm [1]: VHE mode initialized successfully May 27 03:51:14.240452 kernel: Initialise system trusted keyrings May 27 03:51:14.240460 kernel: workingset: timestamp_bits=39 max_order=26 bucket_order=0 May 27 03:51:14.240467 kernel: Key type asymmetric registered May 27 03:51:14.240477 kernel: Asymmetric key parser 'x509' registered May 27 03:51:14.240484 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) May 27 03:51:14.240492 kernel: io scheduler mq-deadline registered May 27 03:51:14.240499 kernel: io scheduler kyber registered May 27 03:51:14.240507 kernel: io scheduler bfq registered May 27 03:51:14.240514 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 27 03:51:14.240522 kernel: ACPI: button: Power Button [PWRB] May 27 03:51:14.240530 kernel: ACPI GTDT: found 1 SBSA generic Watchdog(s). 
May 27 03:51:14.240537 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 27 03:51:14.240604 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: option mask 0x0 May 27 03:51:14.240664 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.240722 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.240777 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 262144 entries for cmdq May 27 03:51:14.240833 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 131072 entries for evtq May 27 03:51:14.240888 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 262144 entries for priq May 27 03:51:14.240951 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: option mask 0x0 May 27 03:51:14.241009 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.241065 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.241120 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 262144 entries for cmdq May 27 03:51:14.241175 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 131072 entries for evtq May 27 03:51:14.241235 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 262144 entries for priq May 27 03:51:14.241298 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: option mask 0x0 May 27 03:51:14.241356 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.241412 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.241467 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 262144 entries for cmdq May 27 03:51:14.241522 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 131072 entries for evtq May 27 03:51:14.241577 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 262144 entries for priq May 27 03:51:14.241639 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: option mask 0x0 May 27 03:51:14.241695 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.241752 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.241807 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 262144 entries for cmdq May 27 03:51:14.241863 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 131072 entries for evtq May 27 03:51:14.241918 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 262144 entries for priq May 27 03:51:14.241980 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: option mask 0x0 May 27 03:51:14.242037 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.242094 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.242149 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 262144 entries for cmdq May 27 03:51:14.242204 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 131072 entries for evtq May 27 03:51:14.242280 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 262144 entries for priq May 27 03:51:14.242348 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: option mask 0x0 May 27 03:51:14.242407 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.242465 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.242524 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 262144 entries for cmdq May 27 03:51:14.242579 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 131072 entries for evtq May 27 
03:51:14.242634 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 262144 entries for priq May 27 03:51:14.242704 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: option mask 0x0 May 27 03:51:14.242763 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.242818 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.242873 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 262144 entries for cmdq May 27 03:51:14.242932 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 131072 entries for evtq May 27 03:51:14.242988 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 262144 entries for priq May 27 03:51:14.243050 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: option mask 0x0 May 27 03:51:14.243106 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: IDR0.COHACC overridden by FW configuration (false) May 27 03:51:14.243163 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: ias 48-bit, oas 48-bit (features 0x001c1eff) May 27 03:51:14.243225 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 262144 entries for cmdq May 27 03:51:14.243281 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 131072 entries for evtq May 27 03:51:14.243337 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 262144 entries for priq May 27 03:51:14.243347 kernel: thunder_xcv, ver 1.0 May 27 03:51:14.243355 kernel: thunder_bgx, ver 1.0 May 27 03:51:14.243363 kernel: nicpf, ver 1.0 May 27 03:51:14.243370 kernel: nicvf, ver 1.0 May 27 03:51:14.243431 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 27 03:51:14.243489 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T03:51:12 UTC (1748317872) May 27 03:51:14.243500 kernel: efifb: probing for efifb May 27 03:51:14.243507 kernel: efifb: framebuffer at 0x20000000, using 1876k, total 1875k May 27 03:51:14.243515 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1 May 27 03:51:14.243523 kernel: efifb: scrolling: redraw May 27 03:51:14.243531 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 May 27 03:51:14.243538 kernel: Console: switching to colour frame buffer device 100x37 May 27 03:51:14.243546 kernel: fb0: EFI VGA frame buffer device May 27 03:51:14.243554 kernel: SMCCC: SOC_ID: ID = jep106:0a16:0001 Revision = 0x000000a1 May 27 03:51:14.243563 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 03:51:14.243571 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available May 27 03:51:14.243578 kernel: watchdog: NMI not fully supported May 27 03:51:14.243586 kernel: NET: Registered PF_INET6 protocol family May 27 03:51:14.243594 kernel: watchdog: Hard watchdog permanently disabled May 27 03:51:14.243601 kernel: Segment Routing with IPv6 May 27 03:51:14.243609 kernel: In-situ OAM (IOAM) with IPv6 May 27 03:51:14.243617 kernel: NET: Registered PF_PACKET protocol family May 27 03:51:14.243624 kernel: Key type dns_resolver registered May 27 03:51:14.243633 kernel: registered taskstats version 1 May 27 03:51:14.243641 kernel: Loading compiled-in X.509 certificates May 27 03:51:14.243649 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 6bbf5412ef1f8a32378a640b6d048f74e6d74df0' May 27 03:51:14.243656 kernel: Demotion targets for Node 0: null May 27 03:51:14.243664 kernel: Key type .fscrypt registered May 27 03:51:14.243671 kernel: Key type fscrypt-provisioning registered May 27 03:51:14.243679 kernel: ima: No TPM chip found, activating TPM-bypass! 
May 27 03:51:14.243686 kernel: ima: Allocated hash algorithm: sha1 May 27 03:51:14.243694 kernel: ima: No architecture policies found May 27 03:51:14.243703 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 27 03:51:14.243765 kernel: pcieport 000d:00:01.0: Adding to iommu group 0 May 27 03:51:14.243826 kernel: pcieport 000d:00:01.0: AER: enabled with IRQ 91 May 27 03:51:14.243887 kernel: pcieport 000d:00:02.0: Adding to iommu group 1 May 27 03:51:14.243947 kernel: pcieport 000d:00:02.0: AER: enabled with IRQ 91 May 27 03:51:14.244008 kernel: pcieport 000d:00:03.0: Adding to iommu group 2 May 27 03:51:14.244068 kernel: pcieport 000d:00:03.0: AER: enabled with IRQ 91 May 27 03:51:14.244130 kernel: pcieport 000d:00:04.0: Adding to iommu group 3 May 27 03:51:14.244190 kernel: pcieport 000d:00:04.0: AER: enabled with IRQ 91 May 27 03:51:14.244257 kernel: pcieport 0000:00:01.0: Adding to iommu group 4 May 27 03:51:14.244318 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 92 May 27 03:51:14.244379 kernel: pcieport 0000:00:02.0: Adding to iommu group 5 May 27 03:51:14.244439 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 92 May 27 03:51:14.244500 kernel: pcieport 0000:00:03.0: Adding to iommu group 6 May 27 03:51:14.244560 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 92 May 27 03:51:14.244621 kernel: pcieport 0000:00:04.0: Adding to iommu group 7 May 27 03:51:14.244681 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 92 May 27 03:51:14.244744 kernel: pcieport 0005:00:01.0: Adding to iommu group 8 May 27 03:51:14.244806 kernel: pcieport 0005:00:01.0: AER: enabled with IRQ 93 May 27 03:51:14.244867 kernel: pcieport 0005:00:03.0: Adding to iommu group 9 May 27 03:51:14.244928 kernel: pcieport 0005:00:03.0: AER: enabled with IRQ 93 May 27 03:51:14.244989 kernel: pcieport 0005:00:05.0: Adding to iommu group 10 May 27 03:51:14.245049 kernel: pcieport 0005:00:05.0: AER: enabled with IRQ 93 May 27 03:51:14.245110 kernel: pcieport 0005:00:07.0: Adding to iommu group 11 May 27 03:51:14.245170 kernel: pcieport 0005:00:07.0: AER: enabled with IRQ 93 May 27 03:51:14.245237 kernel: pcieport 0003:00:01.0: Adding to iommu group 12 May 27 03:51:14.245298 kernel: pcieport 0003:00:01.0: AER: enabled with IRQ 94 May 27 03:51:14.245359 kernel: pcieport 0003:00:03.0: Adding to iommu group 13 May 27 03:51:14.245419 kernel: pcieport 0003:00:03.0: AER: enabled with IRQ 94 May 27 03:51:14.245480 kernel: pcieport 0003:00:05.0: Adding to iommu group 14 May 27 03:51:14.245539 kernel: pcieport 0003:00:05.0: AER: enabled with IRQ 94 May 27 03:51:14.245601 kernel: pcieport 000c:00:01.0: Adding to iommu group 15 May 27 03:51:14.245664 kernel: pcieport 000c:00:01.0: AER: enabled with IRQ 95 May 27 03:51:14.245727 kernel: pcieport 000c:00:02.0: Adding to iommu group 16 May 27 03:51:14.245789 kernel: pcieport 000c:00:02.0: AER: enabled with IRQ 95 May 27 03:51:14.245849 kernel: pcieport 000c:00:03.0: Adding to iommu group 17 May 27 03:51:14.245909 kernel: pcieport 000c:00:03.0: AER: enabled with IRQ 95 May 27 03:51:14.245971 kernel: pcieport 000c:00:04.0: Adding to iommu group 18 May 27 03:51:14.246031 kernel: pcieport 000c:00:04.0: AER: enabled with IRQ 95 May 27 03:51:14.246092 kernel: pcieport 0002:00:01.0: Adding to iommu group 19 May 27 03:51:14.246152 kernel: pcieport 0002:00:01.0: AER: enabled with IRQ 96 May 27 03:51:14.246215 kernel: pcieport 0002:00:03.0: Adding to iommu group 20 May 27 03:51:14.246279 kernel: pcieport 0002:00:03.0: AER: enabled with IRQ 96 May 27 03:51:14.246339 
kernel: pcieport 0002:00:05.0: Adding to iommu group 21 May 27 03:51:14.246400 kernel: pcieport 0002:00:05.0: AER: enabled with IRQ 96 May 27 03:51:14.246461 kernel: pcieport 0002:00:07.0: Adding to iommu group 22 May 27 03:51:14.246521 kernel: pcieport 0002:00:07.0: AER: enabled with IRQ 96 May 27 03:51:14.246583 kernel: pcieport 0001:00:01.0: Adding to iommu group 23 May 27 03:51:14.246643 kernel: pcieport 0001:00:01.0: AER: enabled with IRQ 97 May 27 03:51:14.246704 kernel: pcieport 0001:00:02.0: Adding to iommu group 24 May 27 03:51:14.246765 kernel: pcieport 0001:00:02.0: AER: enabled with IRQ 97 May 27 03:51:14.246826 kernel: pcieport 0001:00:03.0: Adding to iommu group 25 May 27 03:51:14.246886 kernel: pcieport 0001:00:03.0: AER: enabled with IRQ 97 May 27 03:51:14.246946 kernel: pcieport 0001:00:04.0: Adding to iommu group 26 May 27 03:51:14.247006 kernel: pcieport 0001:00:04.0: AER: enabled with IRQ 97 May 27 03:51:14.247067 kernel: pcieport 0004:00:01.0: Adding to iommu group 27 May 27 03:51:14.247127 kernel: pcieport 0004:00:01.0: AER: enabled with IRQ 98 May 27 03:51:14.247188 kernel: pcieport 0004:00:03.0: Adding to iommu group 28 May 27 03:51:14.247270 kernel: pcieport 0004:00:03.0: AER: enabled with IRQ 98 May 27 03:51:14.247332 kernel: pcieport 0004:00:05.0: Adding to iommu group 29 May 27 03:51:14.247391 kernel: pcieport 0004:00:05.0: AER: enabled with IRQ 98 May 27 03:51:14.247455 kernel: pcieport 0004:01:00.0: Adding to iommu group 30 May 27 03:51:14.247465 kernel: clk: Disabling unused clocks May 27 03:51:14.247473 kernel: PM: genpd: Disabling unused power domains May 27 03:51:14.247481 kernel: Warning: unable to open an initial console. May 27 03:51:14.247488 kernel: Freeing unused kernel memory: 39424K May 27 03:51:14.247496 kernel: Run /init as init process May 27 03:51:14.247506 kernel: with arguments: May 27 03:51:14.247514 kernel: /init May 27 03:51:14.247521 kernel: with environment: May 27 03:51:14.247528 kernel: HOME=/ May 27 03:51:14.247536 kernel: TERM=linux May 27 03:51:14.247543 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 03:51:14.247552 systemd[1]: Successfully made /usr/ read-only. May 27 03:51:14.247562 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 03:51:14.247573 systemd[1]: Detected architecture arm64. May 27 03:51:14.247580 systemd[1]: Running in initrd. May 27 03:51:14.247588 systemd[1]: No hostname configured, using default hostname. May 27 03:51:14.247596 systemd[1]: Hostname set to . May 27 03:51:14.247604 systemd[1]: Initializing machine ID from random generator. May 27 03:51:14.247612 systemd[1]: Queued start job for default target initrd.target. May 27 03:51:14.247620 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 03:51:14.247628 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 03:51:14.247638 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 27 03:51:14.247646 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
May 27 03:51:14.247654 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 27 03:51:14.247662 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 03:51:14.247672 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 03:51:14.247680 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 03:51:14.247689 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 03:51:14.247697 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 03:51:14.247705 systemd[1]: Reached target paths.target - Path Units. May 27 03:51:14.247713 systemd[1]: Reached target slices.target - Slice Units. May 27 03:51:14.247721 systemd[1]: Reached target swap.target - Swaps. May 27 03:51:14.247729 systemd[1]: Reached target timers.target - Timer Units. May 27 03:51:14.247737 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 03:51:14.247745 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 03:51:14.247753 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 03:51:14.247762 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 03:51:14.247770 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 03:51:14.247778 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 03:51:14.247786 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 03:51:14.247794 systemd[1]: Reached target sockets.target - Socket Units. May 27 03:51:14.247803 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 03:51:14.247811 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 03:51:14.247819 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 03:51:14.247827 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 03:51:14.247836 systemd[1]: Starting systemd-fsck-usr.service... May 27 03:51:14.247844 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 03:51:14.247852 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 03:51:14.247860 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:51:14.247888 systemd-journald[910]: Collecting audit messages is disabled. May 27 03:51:14.247908 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 03:51:14.247917 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 27 03:51:14.247925 kernel: Bridge firewalling registered May 27 03:51:14.247933 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 03:51:14.247944 systemd-journald[910]: Journal started May 27 03:51:14.247962 systemd-journald[910]: Runtime Journal (/run/log/journal/2426c7a6387e445894edfe1a919ba22c) is 8M, max 4G, 3.9G free. 
May 27 03:51:14.187644 systemd-modules-load[913]: Inserted module 'overlay' May 27 03:51:14.211004 systemd-modules-load[913]: Inserted module 'br_netfilter' May 27 03:51:14.295388 systemd[1]: Started systemd-journald.service - Journal Service. May 27 03:51:14.302227 systemd[1]: Finished systemd-fsck-usr.service. May 27 03:51:14.307825 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 03:51:14.318516 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:51:14.333098 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 03:51:14.341063 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 03:51:14.372747 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 03:51:14.379415 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 03:51:14.392013 systemd-tmpfiles[943]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 03:51:14.398842 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 03:51:14.413697 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 03:51:14.430151 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 03:51:14.441384 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 03:51:14.461219 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 03:51:14.492520 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 03:51:14.499818 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 03:51:14.524937 dracut-cmdline[964]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=packet flatcar.autologin verity.usrhash=4c3f98aae7a61b3dcbab6391ba922461adab29dbcb79fd6e18169f93c5a4ab5a May 27 03:51:14.514286 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 03:51:14.532474 systemd-resolved[967]: Positive Trust Anchors: May 27 03:51:14.532483 systemd-resolved[967]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 03:51:14.532515 systemd-resolved[967]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 03:51:14.548066 systemd-resolved[967]: Defaulting to hostname 'linux'. May 27 03:51:14.549622 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 03:51:14.571253 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
May 27 03:51:14.692213 kernel: SCSI subsystem initialized May 27 03:51:14.707215 kernel: Loading iSCSI transport class v2.0-870. May 27 03:51:14.727215 kernel: iscsi: registered transport (tcp) May 27 03:51:14.754836 kernel: iscsi: registered transport (qla4xxx) May 27 03:51:14.754857 kernel: QLogic iSCSI HBA Driver May 27 03:51:14.773610 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 03:51:14.800469 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:51:14.817019 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 03:51:14.868217 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 03:51:14.875816 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 03:51:14.961221 kernel: raid6: neonx8 gen() 15858 MB/s May 27 03:51:14.987217 kernel: raid6: neonx4 gen() 15883 MB/s May 27 03:51:15.013217 kernel: raid6: neonx2 gen() 13249 MB/s May 27 03:51:15.038216 kernel: raid6: neonx1 gen() 10593 MB/s May 27 03:51:15.063217 kernel: raid6: int64x8 gen() 6925 MB/s May 27 03:51:15.088216 kernel: raid6: int64x4 gen() 7365 MB/s May 27 03:51:15.113215 kernel: raid6: int64x2 gen() 6130 MB/s May 27 03:51:15.141654 kernel: raid6: int64x1 gen() 5077 MB/s May 27 03:51:15.141675 kernel: raid6: using algorithm neonx4 gen() 15883 MB/s May 27 03:51:15.176077 kernel: raid6: .... xor() 12435 MB/s, rmw enabled May 27 03:51:15.176101 kernel: raid6: using neon recovery algorithm May 27 03:51:15.200855 kernel: xor: measuring software checksum speed May 27 03:51:15.200877 kernel: 8regs : 21618 MB/sec May 27 03:51:15.217492 kernel: 32regs : 21343 MB/sec May 27 03:51:15.217513 kernel: arm64_neon : 28215 MB/sec May 27 03:51:15.225583 kernel: xor: using function: arm64_neon (28215 MB/sec) May 27 03:51:15.291215 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 03:51:15.296968 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 03:51:15.306845 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 03:51:15.350620 systemd-udevd[1185]: Using default interface naming scheme 'v255'. May 27 03:51:15.354822 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 03:51:15.360979 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 03:51:15.399199 dracut-pre-trigger[1194]: rd.md=0: removing MD RAID activation May 27 03:51:15.422284 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 03:51:15.433330 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 03:51:15.731250 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 03:51:15.853837 kernel: pps_core: LinuxPPS API ver. 1 registered May 27 03:51:15.853867 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 27 03:51:15.853886 kernel: ACPI: bus type USB registered May 27 03:51:15.853905 kernel: PTP clock support registered May 27 03:51:15.853929 kernel: usbcore: registered new interface driver usbfs May 27 03:51:15.853947 kernel: usbcore: registered new interface driver hub May 27 03:51:15.853965 kernel: nvme 0005:03:00.0: Adding to iommu group 31 May 27 03:51:15.854140 kernel: usbcore: registered new device driver usb May 27 03:51:15.854151 kernel: nvme 0005:04:00.0: Adding to iommu group 32 May 27 03:51:15.854248 kernel: nvme nvme0: pci function 0005:03:00.0 May 27 03:51:15.854340 kernel: nvme nvme1: pci function 0005:04:00.0 May 27 03:51:15.854420 kernel: nvme nvme0: D3 entry latency set to 8 seconds May 27 03:51:15.856216 kernel: nvme nvme1: D3 entry latency set to 8 seconds May 27 03:51:15.872349 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 27 03:51:15.880215 kernel: nvme nvme1: 32/0/0 default/read/poll queues May 27 03:51:15.880310 kernel: nvme nvme0: 32/0/0 default/read/poll queues May 27 03:51:15.919694 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 03:51:15.919708 kernel: GPT:9289727 != 1875385007 May 27 03:51:15.919717 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 03:51:15.919726 kernel: GPT:9289727 != 1875385007 May 27 03:51:15.919734 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 03:51:15.919743 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:51:15.961478 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 03:51:15.961627 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:51:15.971136 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:51:16.076113 kernel: xhci_hcd 0004:03:00.0: Adding to iommu group 33 May 27 03:51:16.076265 kernel: igb: Intel(R) Gigabit Ethernet Network Driver May 27 03:51:16.076277 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. May 27 03:51:16.076286 kernel: igb 0003:03:00.0: Adding to iommu group 34 May 27 03:51:16.076377 kernel: xhci_hcd 0004:03:00.0: xHCI Host Controller May 27 03:51:16.076454 kernel: xhci_hcd 0004:03:00.0: new USB bus registered, assigned bus number 1 May 27 03:51:16.076529 kernel: xhci_hcd 0004:03:00.0: Zeroing 64bit base registers, expecting fault May 27 03:51:16.076602 kernel: mlx5_core 0001:01:00.0: Adding to iommu group 35 May 27 03:51:16.072110 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:51:16.081746 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 03:51:16.097288 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 03:51:16.117868 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:51:16.136645 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - SAMSUNG MZ1LB960HAJQ-00007 EFI-SYSTEM. May 27 03:51:16.163042 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - SAMSUNG MZ1LB960HAJQ-00007 ROOT. May 27 03:51:16.186155 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - SAMSUNG MZ1LB960HAJQ-00007 USR-A. May 27 03:51:16.191980 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - SAMSUNG MZ1LB960HAJQ-00007 USR-A. May 27 03:51:16.216872 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - SAMSUNG MZ1LB960HAJQ-00007 OEM. 
May 27 03:51:16.373398 kernel: xhci_hcd 0004:03:00.0: hcc params 0x014051cf hci version 0x100 quirks 0x0000000100000010 May 27 03:51:16.373557 kernel: xhci_hcd 0004:03:00.0: xHCI Host Controller May 27 03:51:16.373637 kernel: xhci_hcd 0004:03:00.0: new USB bus registered, assigned bus number 2 May 27 03:51:16.373713 kernel: xhci_hcd 0004:03:00.0: Host supports USB 3.0 SuperSpeed May 27 03:51:16.373787 kernel: hub 1-0:1.0: USB hub found May 27 03:51:16.373881 kernel: hub 1-0:1.0: 4 ports detected May 27 03:51:16.373954 kernel: mlx5_core 0001:01:00.0: PTM is not supported by PCIe May 27 03:51:16.374037 kernel: mlx5_core 0001:01:00.0: firmware version: 14.30.1004 May 27 03:51:16.374111 kernel: mlx5_core 0001:01:00.0: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 03:51:16.374184 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. May 27 03:51:16.291111 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 03:51:16.379017 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 03:51:16.435518 kernel: hub 2-0:1.0: USB hub found May 27 03:51:16.435658 kernel: hub 2-0:1.0: 4 ports detected May 27 03:51:16.435743 kernel: igb 0003:03:00.0: added PHC on eth0 May 27 03:51:16.435829 kernel: igb 0003:03:00.0: Intel(R) Gigabit Ethernet Network Connection May 27 03:51:16.435902 kernel: igb 0003:03:00.0: eth0: (PCIe:5.0Gb/s:Width x2) 18:c0:4d:80:54:6c May 27 03:51:16.435974 kernel: igb 0003:03:00.0: eth0: PBA No: 106300-000 May 27 03:51:16.395644 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 03:51:16.498758 kernel: igb 0003:03:00.0: Using MSI-X interrupts. 8 rx queue(s), 8 tx queue(s) May 27 03:51:16.498862 kernel: igb 0003:03:00.1: Adding to iommu group 36 May 27 03:51:16.494324 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 03:51:16.520736 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 03:51:16.664831 kernel: igb 0003:03:00.1: added PHC on eth1 May 27 03:51:16.664979 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:51:16.664991 kernel: igb 0003:03:00.1: Intel(R) Gigabit Ethernet Network Connection May 27 03:51:16.665069 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:51:16.665079 kernel: igb 0003:03:00.1: eth1: (PCIe:5.0Gb/s:Width x2) 18:c0:4d:80:54:6d May 27 03:51:16.665153 kernel: igb 0003:03:00.1: eth1: PBA No: 106300-000 May 27 03:51:16.665232 kernel: igb 0003:03:00.1: Using MSI-X interrupts. 8 rx queue(s), 8 tx queue(s) May 27 03:51:16.665307 kernel: igb 0003:03:00.1 eno2: renamed from eth1 May 27 03:51:16.665382 kernel: mlx5_core 0001:01:00.0: E-Switch: Total vports 2, per vport: max uc(1024) max mc(16384) May 27 03:51:16.665465 kernel: usb 1-3: new high-speed USB device number 2 using xhci_hcd May 27 03:51:16.665486 kernel: igb 0003:03:00.0 eno1: renamed from eth0 May 27 03:51:16.665567 kernel: mlx5_core 0001:01:00.0: Port module event: module 0, Cable plugged May 27 03:51:16.665651 disk-uuid[1340]: Primary Header is updated. May 27 03:51:16.665651 disk-uuid[1340]: Secondary Entries is updated. May 27 03:51:16.665651 disk-uuid[1340]: Secondary Header is updated. May 27 03:51:16.684810 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
May 27 03:51:16.775437 kernel: hub 1-3:1.0: USB hub found May 27 03:51:16.775653 kernel: hub 1-3:1.0: 4 ports detected May 27 03:51:16.880219 kernel: usb 2-3: new SuperSpeed USB device number 2 using xhci_hcd May 27 03:51:16.907217 kernel: hub 2-3:1.0: USB hub found May 27 03:51:16.916210 kernel: hub 2-3:1.0: 4 ports detected May 27 03:51:16.934214 kernel: mlx5_core 0001:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) May 27 03:51:16.946219 kernel: mlx5_core 0001:01:00.1: Adding to iommu group 37 May 27 03:51:16.962923 kernel: mlx5_core 0001:01:00.1: PTM is not supported by PCIe May 27 03:51:16.963075 kernel: mlx5_core 0001:01:00.1: firmware version: 14.30.1004 May 27 03:51:16.978134 kernel: mlx5_core 0001:01:00.1: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 03:51:17.259218 kernel: mlx5_core 0001:01:00.1: E-Switch: Total vports 2, per vport: max uc(1024) max mc(16384) May 27 03:51:17.278557 kernel: mlx5_core 0001:01:00.1: Port module event: module 1, Cable plugged May 27 03:51:17.546219 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:51:17.546366 disk-uuid[1341]: The operation has completed successfully. May 27 03:51:17.585222 kernel: mlx5_core 0001:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) May 27 03:51:17.600216 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: renamed from eth1 May 27 03:51:17.600311 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: renamed from eth0 May 27 03:51:17.646080 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 03:51:17.646174 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 03:51:17.653094 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 03:51:17.679789 sh[1542]: Success May 27 03:51:17.719400 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 03:51:17.719433 kernel: device-mapper: uevent: version 1.0.3 May 27 03:51:17.728818 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 03:51:17.756214 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 27 03:51:17.787785 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 03:51:17.795651 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 03:51:17.814161 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 03:51:17.821212 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 03:51:17.821229 kernel: BTRFS: device fsid 5c6341ea-4eb5-44b6-ac57-c4d29847e384 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (1560) May 27 03:51:17.824210 kernel: BTRFS info (device dm-0): first mount of filesystem 5c6341ea-4eb5-44b6-ac57-c4d29847e384 May 27 03:51:17.824221 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 27 03:51:17.824231 kernel: BTRFS info (device dm-0): using free-space-tree May 27 03:51:17.911277 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 03:51:17.917519 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 03:51:17.927828 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
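
verity-setup.service above sets up /dev/mapper/usr with dm-verity ("verity: sha256 using shash \"sha256-ce\""), so each data block of the /usr image is checked against a sha256 hash tree before btrfs ever sees it. The sketch below shows only the leaf level of such a tree; the image path, the 4 KiB block size and the salt handling are illustrative assumptions, not Flatcar's exact parameters.

    # Minimal sketch of the leaf level of a dm-verity-style hash tree.
    import hashlib

    IMAGE = "usr.img"    # hypothetical data image path (assumption)
    BLOCK = 4096         # common dm-verity data block size (assumption)

    def leaf_hashes(path, block=BLOCK, salt=b""):
        """Return a sha256 digest per data block; real dm-verity mixes a salt in
        (ordering depends on the format version), which is only hinted at here."""
        out = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(block)
                if not chunk:
                    break
                chunk = chunk.ljust(block, b"\0")   # last block is zero-padded
                out.append(hashlib.sha256(salt + chunk).digest())
        return out

    if __name__ == "__main__":
        hashes = leaf_hashes(IMAGE)
        print(f"{len(hashes)} leaf hashes; upper levels hash these again up to a root hash")
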
May 27 03:51:17.928985 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 03:51:17.945793 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 03:51:18.066035 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 (259:6) scanned by mount (1583) May 27 03:51:18.066056 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 03:51:18.066067 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 03:51:18.066076 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:51:18.066086 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 03:51:18.061407 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 03:51:18.075643 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 03:51:18.099304 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 03:51:18.122560 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 03:51:18.170708 systemd-networkd[1734]: lo: Link UP May 27 03:51:18.170714 systemd-networkd[1734]: lo: Gained carrier May 27 03:51:18.174511 systemd-networkd[1734]: Enumeration completed May 27 03:51:18.174611 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 03:51:18.175770 systemd-networkd[1734]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:51:18.182156 systemd[1]: Reached target network.target - Network. May 27 03:51:18.218067 ignition[1726]: Ignition 2.21.0 May 27 03:51:18.218073 ignition[1726]: Stage: fetch-offline May 27 03:51:18.224083 unknown[1726]: fetched base config from "system" May 27 03:51:18.218117 ignition[1726]: no configs at "/usr/lib/ignition/base.d" May 27 03:51:18.224091 unknown[1726]: fetched user config from "system" May 27 03:51:18.218125 ignition[1726]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:18.227628 systemd-networkd[1734]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:51:18.218447 ignition[1726]: parsed url from cmdline: "" May 27 03:51:18.227861 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 03:51:18.218449 ignition[1726]: no config URL provided May 27 03:51:18.237700 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 27 03:51:18.218454 ignition[1726]: reading system config file "/usr/lib/ignition/user.ign" May 27 03:51:18.238610 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 27 03:51:18.218505 ignition[1726]: parsing config with SHA512: cbe38912de2731ca7590d1e4aae81b622d569ac898487efddd884e418139415f48e08ea68d8139b0b538efddf9873047f0ca1cee716eaa1233297a09a47d68b3 May 27 03:51:18.278748 systemd-networkd[1734]: enP1p1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. 
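
The fetch-offline stage above finds no config URL on the kernel command line, falls back to /usr/lib/ignition/user.ign, and logs the SHA512 of the config it parsed. A minimal sketch of that digest computation, using the path named in the log:

    # Compute the same SHA512 Ignition reports for its parsed config.
    import hashlib

    CONFIG = "/usr/lib/ignition/user.ign"   # path taken from the log above

    with open(CONFIG, "rb") as f:
        digest = hashlib.sha512(f.read()).hexdigest()
    print("config SHA512:", digest)
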
May 27 03:51:18.224401 ignition[1726]: fetch-offline: fetch-offline passed May 27 03:51:18.224406 ignition[1726]: POST message to Packet Timeline May 27 03:51:18.224410 ignition[1726]: POST Status error: resource requires networking May 27 03:51:18.224459 ignition[1726]: Ignition finished successfully May 27 03:51:18.290235 ignition[1786]: Ignition 2.21.0 May 27 03:51:18.290241 ignition[1786]: Stage: kargs May 27 03:51:18.290626 ignition[1786]: no configs at "/usr/lib/ignition/base.d" May 27 03:51:18.290635 ignition[1786]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:18.292347 ignition[1786]: kargs: kargs passed May 27 03:51:18.292352 ignition[1786]: POST message to Packet Timeline May 27 03:51:18.292761 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #1 May 27 03:51:18.301732 ignition[1786]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47528->[::1]:53: read: connection refused May 27 03:51:18.501863 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #2 May 27 03:51:18.502485 ignition[1786]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:50632->[::1]:53: read: connection refused May 27 03:51:18.860219 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: Link up May 27 03:51:18.863201 systemd-networkd[1734]: enP1p1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:51:18.903104 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #3 May 27 03:51:18.903609 ignition[1786]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:46156->[::1]:53: read: connection refused May 27 03:51:19.476216 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: Link up May 27 03:51:19.478899 systemd-networkd[1734]: eno1: Link UP May 27 03:51:19.479031 systemd-networkd[1734]: eno2: Link UP May 27 03:51:19.479150 systemd-networkd[1734]: enP1p1s0f0np0: Link UP May 27 03:51:19.479359 systemd-networkd[1734]: enP1p1s0f0np0: Gained carrier May 27 03:51:19.497419 systemd-networkd[1734]: enP1p1s0f1np1: Link UP May 27 03:51:19.498460 systemd-networkd[1734]: enP1p1s0f1np1: Gained carrier May 27 03:51:19.540238 systemd-networkd[1734]: enP1p1s0f0np0: DHCPv4 address 147.28.163.138/30, gateway 147.28.163.137 acquired from 147.28.144.140 May 27 03:51:19.704358 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #4 May 27 03:51:19.705116 ignition[1786]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:46881->[::1]:53: read: connection refused May 27 03:51:20.499340 systemd-networkd[1734]: enP1p1s0f0np0: Gained IPv6LL May 27 03:51:20.627451 systemd-networkd[1734]: enP1p1s0f1np1: Gained IPv6LL May 27 03:51:21.305881 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #5 May 27 03:51:21.306611 ignition[1786]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:58428->[::1]:53: read: connection refused May 27 03:51:24.509171 ignition[1786]: GET https://metadata.packet.net/metadata: attempt #6 May 27 03:51:24.982617 ignition[1786]: GET result: OK May 27 03:51:25.762384 ignition[1786]: Ignition finished successfully May 27 03:51:25.767274 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
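
The kargs stage above cannot resolve metadata.packet.net until the igb/mlx5 links come up and a DHCPv4 address is acquired, so it retries the GET with growing intervals (attempts #1 through #6) before "GET result: OK". Below is a minimal sketch of that retry-with-backoff pattern using only the standard library; the timeout, multiplier and cap are illustrative assumptions, not Ignition's actual values.

    # Retry-with-backoff sketch for a metadata fetch, mirroring the pattern above.
    import time
    import urllib.error
    import urllib.request

    URL = "https://metadata.packet.net/metadata"   # endpoint from the log above

    def fetch_with_backoff(url, max_attempts=6, delay=1.0, cap=30.0):
        for attempt in range(1, max_attempts + 1):
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError) as exc:
                print(f"attempt #{attempt} failed: {exc}")
                if attempt == max_attempts:
                    raise
                time.sleep(delay)
                delay = min(delay * 2, cap)        # exponential backoff with a cap

    if __name__ == "__main__":
        body = fetch_with_backoff(URL)
        print(f"GET result: OK ({len(body)} bytes)")
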
May 27 03:51:25.769586 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 03:51:25.805516 ignition[1815]: Ignition 2.21.0 May 27 03:51:25.805523 ignition[1815]: Stage: disks May 27 03:51:25.805677 ignition[1815]: no configs at "/usr/lib/ignition/base.d" May 27 03:51:25.805685 ignition[1815]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:25.806634 ignition[1815]: disks: disks passed May 27 03:51:25.806638 ignition[1815]: POST message to Packet Timeline May 27 03:51:25.806656 ignition[1815]: GET https://metadata.packet.net/metadata: attempt #1 May 27 03:51:26.683103 ignition[1815]: GET result: OK May 27 03:51:27.126007 ignition[1815]: Ignition finished successfully May 27 03:51:27.128758 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 03:51:27.134817 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 03:51:27.142595 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 03:51:27.150803 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 03:51:27.159540 systemd[1]: Reached target sysinit.target - System Initialization. May 27 03:51:27.168704 systemd[1]: Reached target basic.target - Basic System. May 27 03:51:27.179219 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 03:51:27.215631 systemd-fsck[1840]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 03:51:27.218913 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 03:51:27.227289 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 03:51:27.321218 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 5656cec4-efbd-4a2d-be98-2263e6ae16bd r/w with ordered data mode. Quota mode: none. May 27 03:51:27.321600 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 03:51:27.332052 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 03:51:27.343287 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 03:51:27.364735 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 03:51:27.373211 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 (259:6) scanned by mount (1857) May 27 03:51:27.374213 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 03:51:27.374225 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 03:51:27.374235 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:51:27.441512 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 03:51:27.448036 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... May 27 03:51:27.464196 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 03:51:27.464225 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 03:51:27.477637 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 03:51:27.518139 coreos-metadata[1877]: May 27 03:51:27.515 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 03:51:27.535029 coreos-metadata[1876]: May 27 03:51:27.515 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 03:51:27.500889 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
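
systemd-fsck above reports the root filesystem as clean with "15/553520 files, 52789/553472 blocks" in use; as a quick worked example, those counters translate into the following utilisation figures:

    # Numbers copied from the systemd-fsck line above ("ROOT: clean, ...").
    files_used, files_total = 15, 553_520
    blocks_used, blocks_total = 52_789, 553_472
    print(f"inodes in use: {files_used}/{files_total} = {files_used / files_total:.4%}")
    print(f"blocks in use: {blocks_used}/{blocks_total} = {blocks_used / blocks_total:.2%}")
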
May 27 03:51:27.514816 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 03:51:27.574287 initrd-setup-root[1901]: cut: /sysroot/etc/passwd: No such file or directory May 27 03:51:27.580787 initrd-setup-root[1909]: cut: /sysroot/etc/group: No such file or directory May 27 03:51:27.587286 initrd-setup-root[1916]: cut: /sysroot/etc/shadow: No such file or directory May 27 03:51:27.593809 initrd-setup-root[1923]: cut: /sysroot/etc/gshadow: No such file or directory May 27 03:51:27.663983 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 03:51:27.676223 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 03:51:27.701884 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 03:51:27.710212 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 03:51:27.734945 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 03:51:27.744730 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 03:51:27.759072 ignition[1995]: INFO : Ignition 2.21.0 May 27 03:51:27.759072 ignition[1995]: INFO : Stage: mount May 27 03:51:27.770249 ignition[1995]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:51:27.770249 ignition[1995]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:27.770249 ignition[1995]: INFO : mount: mount passed May 27 03:51:27.770249 ignition[1995]: INFO : POST message to Packet Timeline May 27 03:51:27.770249 ignition[1995]: INFO : GET https://metadata.packet.net/metadata: attempt #1 May 27 03:51:28.019972 coreos-metadata[1877]: May 27 03:51:28.019 INFO Fetch successful May 27 03:51:28.024957 coreos-metadata[1876]: May 27 03:51:28.024 INFO Fetch successful May 27 03:51:28.067125 coreos-metadata[1876]: May 27 03:51:28.067 INFO wrote hostname ci-4344.0.0-a-636136d453 to /sysroot/etc/hostname May 27 03:51:28.067308 systemd[1]: flatcar-static-network.service: Deactivated successfully. May 27 03:51:28.068315 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. May 27 03:51:28.082278 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 03:51:28.316072 ignition[1995]: INFO : GET result: OK May 27 03:51:28.765869 ignition[1995]: INFO : Ignition finished successfully May 27 03:51:28.770316 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 03:51:28.777084 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 03:51:28.796320 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 03:51:28.850190 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 (259:6) scanned by mount (2023) May 27 03:51:28.850238 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem eabe2c18-04ac-4289-8962-26387aada3f9 May 27 03:51:28.864541 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 03:51:28.877543 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:51:28.886647 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
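
flatcar-metadata-hostname above fetches the Packet/Equinix metadata and writes "ci-4344.0.0-a-636136d453" into /sysroot/etc/hostname. The sketch below shows that fetch-and-write flow in outline; the "hostname" key in the metadata JSON and the target path are assumptions read off the log, not a verified description of the agent.

    # Hedged sketch of the metadata-to-hostname flow visible in the log above.
    import json
    import urllib.request

    METADATA_URL = "https://metadata.packet.net/metadata"   # endpoint from the log
    TARGET = "/sysroot/etc/hostname"                        # target path from the log

    with urllib.request.urlopen(METADATA_URL, timeout=10) as resp:
        metadata = json.load(resp)

    hostname = metadata["hostname"]          # assumed key in the metadata JSON
    with open(TARGET, "w") as f:
        f.write(hostname + "\n")
    print("wrote hostname", hostname, "to", TARGET)
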
May 27 03:51:28.920987 ignition[2040]: INFO : Ignition 2.21.0 May 27 03:51:28.920987 ignition[2040]: INFO : Stage: files May 27 03:51:28.930637 ignition[2040]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:51:28.930637 ignition[2040]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:28.930637 ignition[2040]: DEBUG : files: compiled without relabeling support, skipping May 27 03:51:28.930637 ignition[2040]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 03:51:28.930637 ignition[2040]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 03:51:28.930637 ignition[2040]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 03:51:28.930637 ignition[2040]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 03:51:28.930637 ignition[2040]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 03:51:28.930637 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 03:51:28.930637 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 May 27 03:51:28.927432 unknown[2040]: wrote ssh authorized keys file for user: core May 27 03:51:29.026600 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 03:51:29.107289 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 03:51:29.118175 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 May 27 03:51:29.445591 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 03:51:29.679774 ignition[2040]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 03:51:29.692218 ignition[2040]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 03:51:29.692218 ignition[2040]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 03:51:29.692218 ignition[2040]: INFO : files: files passed May 27 03:51:29.692218 ignition[2040]: INFO : POST message to Packet Timeline May 27 03:51:29.692218 ignition[2040]: INFO : GET https://metadata.packet.net/metadata: attempt #1 May 27 03:51:30.198452 ignition[2040]: INFO : GET result: OK May 27 03:51:30.561951 ignition[2040]: INFO : Ignition finished successfully May 27 03:51:30.565785 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 03:51:30.575389 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 03:51:30.598752 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 03:51:30.617453 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 03:51:30.618223 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 03:51:30.635565 initrd-setup-root-after-ignition[2081]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 03:51:30.635565 initrd-setup-root-after-ignition[2081]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 03:51:30.630101 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 03:51:30.687127 initrd-setup-root-after-ignition[2085]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 03:51:30.642964 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 03:51:30.659775 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
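
The files stage above is driven by a declarative Ignition config: the helm tarball, the /etc/extensions/kubernetes.raw symlink and the prepare-helm.service unit all come from "storage" and "systemd" sections of that config. Below is a hedged sketch of what such a fragment looks like, built as a Python dict and dumped as JSON; the spec version and exact field names should be checked against the Ignition v3 configuration spec rather than taken from this illustration.

    # Hedged sketch of an Ignition v3-style config fragment that would produce
    # the file, link and unit operations logged above.
    import json

    config = {
        "ignition": {"version": "3.4.0"},                    # assumed spec version
        "storage": {
            "files": [
                {
                    "path": "/opt/helm-v3.17.0-linux-arm64.tar.gz",
                    "contents": {"source": "https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz"},
                },
            ],
            "links": [
                {
                    "path": "/etc/extensions/kubernetes.raw",
                    "target": "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw",
                },
            ],
        },
        "systemd": {
            "units": [
                {"name": "prepare-helm.service", "enabled": True, "contents": "[Unit]\n..."},
            ],
        },
    }

    print(json.dumps(config, indent=2))
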
May 27 03:51:30.713117 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 03:51:30.714271 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 03:51:30.723012 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 03:51:30.733287 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 03:51:30.750057 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 03:51:30.751194 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 03:51:30.790162 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 03:51:30.802843 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 03:51:30.835336 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 03:51:30.847133 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 03:51:30.853029 systemd[1]: Stopped target timers.target - Timer Units. May 27 03:51:30.864493 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 03:51:30.864603 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 03:51:30.876179 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 03:51:30.887422 systemd[1]: Stopped target basic.target - Basic System. May 27 03:51:30.898790 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 03:51:30.910261 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 03:51:30.921443 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 03:51:30.932576 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 03:51:30.943765 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 03:51:30.954968 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 03:51:30.966213 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 03:51:30.977332 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 03:51:30.994138 systemd[1]: Stopped target swap.target - Swaps. May 27 03:51:31.005425 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 03:51:31.005525 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 03:51:31.016892 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 03:51:31.028108 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 03:51:31.039231 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 03:51:31.040292 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 03:51:31.050521 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 03:51:31.050621 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 03:51:31.062032 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 03:51:31.062128 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 03:51:31.073377 systemd[1]: Stopped target paths.target - Path Units. May 27 03:51:31.084705 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
May 27 03:51:31.086226 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 03:51:31.101821 systemd[1]: Stopped target slices.target - Slice Units. May 27 03:51:31.113418 systemd[1]: Stopped target sockets.target - Socket Units. May 27 03:51:31.125151 systemd[1]: iscsid.socket: Deactivated successfully. May 27 03:51:31.125238 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 03:51:31.136879 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 03:51:31.136972 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 03:51:31.243083 ignition[2109]: INFO : Ignition 2.21.0 May 27 03:51:31.243083 ignition[2109]: INFO : Stage: umount May 27 03:51:31.243083 ignition[2109]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:51:31.243083 ignition[2109]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 03:51:31.243083 ignition[2109]: INFO : umount: umount passed May 27 03:51:31.243083 ignition[2109]: INFO : POST message to Packet Timeline May 27 03:51:31.243083 ignition[2109]: INFO : GET https://metadata.packet.net/metadata: attempt #1 May 27 03:51:31.148642 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 03:51:31.148731 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 03:51:31.160263 systemd[1]: ignition-files.service: Deactivated successfully. May 27 03:51:31.160373 systemd[1]: Stopped ignition-files.service - Ignition (files). May 27 03:51:31.172051 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 27 03:51:31.172136 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 03:51:31.190314 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 03:51:31.215758 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 03:51:31.224698 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 03:51:31.224806 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 03:51:31.237306 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 03:51:31.237394 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 03:51:31.251658 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 03:51:31.253624 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 03:51:31.253704 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 03:51:31.295928 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 03:51:31.296136 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 03:51:31.864882 ignition[2109]: INFO : GET result: OK May 27 03:51:32.172581 ignition[2109]: INFO : Ignition finished successfully May 27 03:51:32.175280 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 03:51:32.175511 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 03:51:32.182599 systemd[1]: Stopped target network.target - Network. May 27 03:51:32.191802 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 03:51:32.191875 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 03:51:32.201405 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 03:51:32.201450 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). 
May 27 03:51:32.210990 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 03:51:32.211040 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 03:51:32.220831 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 03:51:32.220881 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 27 03:51:32.230613 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 03:51:32.230706 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 03:51:32.240676 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 27 03:51:32.250509 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 03:51:32.260730 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 03:51:32.260920 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 03:51:32.274570 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 03:51:32.274967 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 03:51:32.275906 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 03:51:32.286604 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 03:51:32.288324 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 03:51:32.295625 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 03:51:32.295743 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 03:51:32.307462 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 03:51:32.315716 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 03:51:32.315770 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 03:51:32.326053 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 03:51:32.326094 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 03:51:32.336556 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 03:51:32.336625 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 03:51:32.346825 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 03:51:32.346875 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 03:51:32.362870 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 03:51:32.374803 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 03:51:32.374920 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 03:51:32.385362 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 03:51:32.387251 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 03:51:32.396566 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 03:51:32.396622 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 03:51:32.407298 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 03:51:32.407321 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 03:51:32.418543 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
May 27 03:51:32.418618 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 03:51:32.430081 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 27 03:51:32.430160 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 03:51:32.441346 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 03:51:32.441387 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 03:51:32.459333 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 27 03:51:32.469987 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 03:51:32.470056 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:51:32.481859 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 03:51:32.481896 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 03:51:32.493882 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 03:51:32.493925 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 03:51:32.505751 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 03:51:32.505788 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 03:51:32.517548 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 03:51:32.517581 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:51:32.531134 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. May 27 03:51:32.531201 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. May 27 03:51:32.531234 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 27 03:51:32.531298 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 03:51:32.531618 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 03:51:32.531692 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 27 03:51:33.014011 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 03:51:33.014187 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 03:51:33.025475 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 03:51:33.036849 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 03:51:33.067742 systemd[1]: Switching root. May 27 03:51:33.135484 systemd-journald[910]: Journal stopped May 27 03:51:35.306850 systemd-journald[910]: Received SIGTERM from PID 1 (systemd). 
May 27 03:51:35.306876 kernel: SELinux: policy capability network_peer_controls=1 May 27 03:51:35.306887 kernel: SELinux: policy capability open_perms=1 May 27 03:51:35.306894 kernel: SELinux: policy capability extended_socket_class=1 May 27 03:51:35.306902 kernel: SELinux: policy capability always_check_network=0 May 27 03:51:35.306909 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 03:51:35.306917 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 03:51:35.306926 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 03:51:35.306933 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 03:51:35.306940 kernel: SELinux: policy capability userspace_initial_context=0 May 27 03:51:35.306948 kernel: audit: type=1403 audit(1748317893.341:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 03:51:35.306958 systemd[1]: Successfully loaded SELinux policy in 141.694ms. May 27 03:51:35.306967 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.556ms. May 27 03:51:35.306976 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 03:51:35.306987 systemd[1]: Detected architecture arm64. May 27 03:51:35.306995 systemd[1]: Detected first boot. May 27 03:51:35.307003 systemd[1]: Hostname set to . May 27 03:51:35.307012 systemd[1]: Initializing machine ID from random generator. May 27 03:51:35.307020 zram_generator::config[2173]: No configuration found. May 27 03:51:35.307033 systemd[1]: Populated /etc with preset unit settings. May 27 03:51:35.307042 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 03:51:35.307051 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 03:51:35.307059 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 03:51:35.307067 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 03:51:35.307076 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 03:51:35.307085 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 03:51:35.307094 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 03:51:35.307103 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 03:51:35.307112 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 03:51:35.307120 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 03:51:35.307129 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 03:51:35.307137 systemd[1]: Created slice user.slice - User and Session Slice. May 27 03:51:35.307146 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 03:51:35.307154 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 03:51:35.307164 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 27 03:51:35.307173 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
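
"Initializing machine ID from random generator" above means /etc/machine-id is being seeded with a fresh random 128-bit ID on this first boot, stored as 32 lowercase hex characters. The sketch below produces an ID of the same shape; systemd additionally fixes the UUID version/variant bits when it generates one, which this illustration glosses over.

    # Produce a machine-id-shaped value: 128 random bits as 32 lowercase hex chars.
    import uuid

    machine_id = uuid.uuid4().hex    # no dashes, matching the /etc/machine-id format
    print(machine_id)
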
May 27 03:51:35.307181 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 03:51:35.307190 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 03:51:35.307199 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 27 03:51:35.307210 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 03:51:35.307221 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 03:51:35.307230 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 03:51:35.307240 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 03:51:35.307249 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 03:51:35.307257 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 03:51:35.307266 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 03:51:35.307275 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 03:51:35.307283 systemd[1]: Reached target slices.target - Slice Units. May 27 03:51:35.307292 systemd[1]: Reached target swap.target - Swaps. May 27 03:51:35.307302 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 03:51:35.307311 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 03:51:35.307320 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 03:51:35.307329 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 03:51:35.307338 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 03:51:35.307348 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 03:51:35.307357 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 03:51:35.307366 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 03:51:35.307375 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 03:51:35.307384 systemd[1]: Mounting media.mount - External Media Directory... May 27 03:51:35.307392 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 03:51:35.307401 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 03:51:35.307412 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 03:51:35.307422 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 03:51:35.307431 systemd[1]: Reached target machines.target - Containers. May 27 03:51:35.307440 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 27 03:51:35.307450 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 03:51:35.307459 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 03:51:35.307468 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 27 03:51:35.307477 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 03:51:35.307485 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
May 27 03:51:35.307494 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 03:51:35.307504 kernel: ACPI: bus type drm_connector registered May 27 03:51:35.307512 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 03:51:35.307521 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 03:51:35.307529 kernel: fuse: init (API version 7.41) May 27 03:51:35.307537 kernel: loop: module loaded May 27 03:51:35.307546 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 03:51:35.307555 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 03:51:35.307564 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 03:51:35.307574 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 03:51:35.307583 systemd[1]: Stopped systemd-fsck-usr.service. May 27 03:51:35.307592 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 03:51:35.307601 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 03:51:35.307610 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 03:51:35.307619 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 03:51:35.307628 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 03:51:35.307653 systemd-journald[2281]: Collecting audit messages is disabled. May 27 03:51:35.307675 systemd-journald[2281]: Journal started May 27 03:51:35.307694 systemd-journald[2281]: Runtime Journal (/run/log/journal/0f7c7669206143d6be00ec8bb6d26fd0) is 8M, max 4G, 3.9G free. May 27 03:51:33.884621 systemd[1]: Queued start job for default target multi-user.target. May 27 03:51:33.908822 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6. May 27 03:51:33.909167 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 03:51:33.909458 systemd[1]: systemd-journald.service: Consumed 3.410s CPU time. May 27 03:51:35.338225 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 03:51:35.359220 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 03:51:35.382534 systemd[1]: verity-setup.service: Deactivated successfully. May 27 03:51:35.382551 systemd[1]: Stopped verity-setup.service. May 27 03:51:35.408232 systemd[1]: Started systemd-journald.service - Journal Service. May 27 03:51:35.413707 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 03:51:35.419504 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 03:51:35.425005 systemd[1]: Mounted media.mount - External Media Directory. May 27 03:51:35.430442 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 03:51:35.435942 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 03:51:35.441411 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 27 03:51:35.447471 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. 
May 27 03:51:35.453120 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 03:51:35.458569 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 03:51:35.458731 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 03:51:35.464103 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 03:51:35.464258 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 03:51:35.470783 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 03:51:35.470951 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 03:51:35.476234 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 03:51:35.476409 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 03:51:35.481961 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 03:51:35.482138 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 03:51:35.487429 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 03:51:35.488312 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 03:51:35.493751 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 03:51:35.498820 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:51:35.505230 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 03:51:35.512228 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 03:51:35.527094 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 03:51:35.533393 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 03:51:35.556896 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 03:51:35.561846 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 03:51:35.561874 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 03:51:35.567548 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 03:51:35.573328 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 03:51:35.578180 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 03:51:35.579416 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 03:51:35.585106 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 27 03:51:35.589955 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 03:51:35.590972 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 03:51:35.595803 systemd-journald[2281]: Time spent on flushing to /var/log/journal/0f7c7669206143d6be00ec8bb6d26fd0 is 25.128ms for 2478 entries. May 27 03:51:35.595803 systemd-journald[2281]: System Journal (/var/log/journal/0f7c7669206143d6be00ec8bb6d26fd0) is 8M, max 195.6M, 187.6M free. May 27 03:51:35.638097 systemd-journald[2281]: Received client request to flush runtime journal. 
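
journald above reports spending 25.128 ms flushing 2478 entries from the runtime journal to the persistent journal; as a quick sanity check on that figure:

    # Figures copied from the systemd-journald flush report above.
    flush_ms, entries = 25.128, 2478
    print(f"average cost per entry: {flush_ms / entries * 1000:.1f} µs")
    print(f"throughput during the flush: {entries / (flush_ms / 1000):,.0f} entries/s")
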
May 27 03:51:35.638143 kernel: loop0: detected capacity change from 0 to 107312 May 27 03:51:35.595795 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 03:51:35.596953 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 03:51:35.613967 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 03:51:35.619695 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 03:51:35.625810 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 03:51:35.641460 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 03:51:35.645857 systemd-tmpfiles[2324]: ACLs are not supported, ignoring. May 27 03:51:35.645869 systemd-tmpfiles[2324]: ACLs are not supported, ignoring. May 27 03:51:35.652211 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 03:51:35.656485 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 03:51:35.661295 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 03:51:35.667228 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 03:51:35.672013 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 03:51:35.676997 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 03:51:35.685567 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 03:51:35.691482 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 03:51:35.716216 kernel: loop1: detected capacity change from 0 to 8 May 27 03:51:35.719096 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 03:51:35.726536 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 03:51:35.727124 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 03:51:35.744844 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 03:51:35.751257 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 03:51:35.768797 systemd-tmpfiles[2350]: ACLs are not supported, ignoring. May 27 03:51:35.768809 systemd-tmpfiles[2350]: ACLs are not supported, ignoring. May 27 03:51:35.773169 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 03:51:35.779216 kernel: loop2: detected capacity change from 0 to 138376 May 27 03:51:35.833337 ldconfig[2314]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 03:51:35.835029 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 03:51:35.839214 kernel: loop3: detected capacity change from 0 to 207008 May 27 03:51:35.912222 kernel: loop4: detected capacity change from 0 to 107312 May 27 03:51:35.928231 kernel: loop5: detected capacity change from 0 to 8 May 27 03:51:35.940217 kernel: loop6: detected capacity change from 0 to 138376 May 27 03:51:35.967215 kernel: loop7: detected capacity change from 0 to 207008 May 27 03:51:35.973640 (sd-merge)[2359]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. May 27 03:51:35.974105 (sd-merge)[2359]: Merged extensions into '/usr'. 
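
The "(sd-merge)" lines above are systemd-sysext discovering the extension images ("containerd-flatcar", "docker-flatcar", "kubernetes", "oem-packet") and overlaying them onto /usr; the loopN "detected capacity change" messages reflect those images being attached as loop devices. A minimal sketch of the discovery half is below; the search directories are the commonly documented sysext locations and should be treated as an assumption here.

    # List candidate sysext extension images in the directories sysext scans.
    from pathlib import Path

    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    for d in SEARCH_DIRS:
        p = Path(d)
        if not p.is_dir():
            continue
        for entry in sorted(p.iterdir()):
            # Ignition above created /etc/extensions/kubernetes.raw as a symlink,
            # so resolve() shows where the image actually lives.
            target = entry.resolve() if entry.is_symlink() else entry
            print(f"{entry} -> {target}")
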
May 27 03:51:35.974697 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 27 03:51:35.983468 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 03:51:35.988616 systemd[1]: Reload requested from client PID 2322 ('systemd-sysext') (unit systemd-sysext.service)... May 27 03:51:35.988626 systemd[1]: Reloading... May 27 03:51:36.035213 zram_generator::config[2387]: No configuration found. May 27 03:51:36.042363 systemd-udevd[2361]: Using default interface naming scheme 'v255'. May 27 03:51:36.115219 kernel: IPMI message handler: version 39.2 May 27 03:51:36.115329 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:51:36.125214 kernel: ipmi device interface May 27 03:51:36.134218 kernel: MACsec IEEE 802.1AE May 27 03:51:36.139217 kernel: ipmi_ssif: IPMI SSIF Interface driver May 27 03:51:36.139245 kernel: ipmi_si: IPMI System Interface driver May 27 03:51:36.158280 kernel: ipmi_si: Unable to find any System Interface(s) May 27 03:51:36.207641 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 27 03:51:36.207999 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - SAMSUNG MZ1LB960HAJQ-00007 OEM. May 27 03:51:36.212661 systemd[1]: Reloading finished in 223 ms. May 27 03:51:36.243652 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 03:51:36.248473 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 03:51:36.302621 systemd[1]: Starting ensure-sysext.service... May 27 03:51:36.308131 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 03:51:36.314767 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 03:51:36.320568 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 03:51:36.326417 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:51:36.336427 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 03:51:36.339088 systemd-tmpfiles[2577]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 03:51:36.339114 systemd-tmpfiles[2577]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 03:51:36.339315 systemd-tmpfiles[2577]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 03:51:36.339496 systemd-tmpfiles[2577]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 03:51:36.340058 systemd-tmpfiles[2577]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 03:51:36.340254 systemd-tmpfiles[2577]: ACLs are not supported, ignoring. May 27 03:51:36.340294 systemd-tmpfiles[2577]: ACLs are not supported, ignoring. May 27 03:51:36.342983 systemd-tmpfiles[2577]: Detected autofs mount point /boot during canonicalization of boot. May 27 03:51:36.342990 systemd-tmpfiles[2577]: Skipping /boot May 27 03:51:36.344036 systemd[1]: Reload requested from client PID 2573 ('systemctl') (unit ensure-sysext.service)... 
May 27 03:51:36.344053 systemd[1]: Reloading... May 27 03:51:36.351730 systemd-tmpfiles[2577]: Detected autofs mount point /boot during canonicalization of boot. May 27 03:51:36.351738 systemd-tmpfiles[2577]: Skipping /boot May 27 03:51:36.390217 zram_generator::config[2615]: No configuration found. May 27 03:51:36.463203 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:51:36.555397 systemd[1]: Reloading finished in 211 ms. May 27 03:51:36.590414 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 03:51:36.595510 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:51:36.606671 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 03:51:36.621232 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 03:51:36.627376 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 03:51:36.634220 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 03:51:36.640285 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 03:51:36.647203 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 27 03:51:36.653625 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 03:51:36.658930 augenrules[2695]: No rules May 27 03:51:36.658979 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 03:51:36.663815 systemd[1]: audit-rules.service: Deactivated successfully. May 27 03:51:36.664027 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 03:51:36.675401 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 03:51:36.683311 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 03:51:36.698108 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 03:51:36.703045 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 03:51:36.704347 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 03:51:36.710366 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 03:51:36.716450 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 03:51:36.720141 augenrules[2710]: /sbin/augenrules: No change May 27 03:51:36.722463 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 03:51:36.727566 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 03:51:36.727670 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 03:51:36.728055 augenrules[2733]: No rules May 27 03:51:36.728965 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
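Note: the docker.socket warning repeated during both reloads above is systemd flagging a ListenStream= path under the legacy /var/run directory and rewriting it to /run/docker.sock on the fly. A hedged sketch of how one would confirm this on such a host (not taken from the log):

    systemctl cat docker.socket    # shows the unit; line 6 still declares ListenStream=/var/run/docker.sock
    # an updated unit or drop-in would declare ListenStream=/run/docker.sock instead, silencing the warning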
May 27 03:51:36.733902 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 03:51:36.737061 systemd[1]: audit-rules.service: Deactivated successfully. May 27 03:51:36.737292 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 03:51:36.742363 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 03:51:36.742512 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 03:51:36.747549 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 03:51:36.748296 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 03:51:36.751204 systemd-resolved[2681]: Positive Trust Anchors: May 27 03:51:36.751219 systemd-resolved[2681]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 03:51:36.751250 systemd-resolved[2681]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 03:51:36.753513 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 03:51:36.753696 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 03:51:36.754810 systemd-resolved[2681]: Using system hostname 'ci-4344.0.0-a-636136d453'. May 27 03:51:36.758791 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 03:51:36.761866 systemd-networkd[2575]: lo: Link UP May 27 03:51:36.761872 systemd-networkd[2575]: lo: Gained carrier May 27 03:51:36.763680 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 03:51:36.763851 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 03:51:36.765576 systemd-networkd[2575]: bond0: netdev ready May 27 03:51:36.769082 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 03:51:36.775294 systemd-networkd[2575]: Enumeration completed May 27 03:51:36.775552 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 03:51:36.780629 systemd[1]: Finished ensure-sysext.service. May 27 03:51:36.783968 systemd-networkd[2575]: enP1p1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:5a:06:d8.network. May 27 03:51:36.790942 systemd[1]: Reached target network.target - Network. May 27 03:51:36.795508 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 03:51:36.801586 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 03:51:36.807378 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 03:51:36.812054 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 03:51:36.812116 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
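Note: systemd-networkd above matches the first port against /etc/systemd/network/10-0c:42:a1:5a:06:d8.network and reports "bond0: netdev ready"; the contents of those unit files are not included in the log. As a purely illustrative sketch, the resulting bond and its member ports could be examined with:

    networkctl list                 # lists lo, bond0 and the enP1p1s0f*np* ports with their operational state
    networkctl status bond0         # addresses, carrier state and the links bound to the bond
    cat /proc/net/bonding/bond0     # bonding driver view: mode, 802.3ad/LACP status and per-slave link state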
May 27 03:51:36.829867 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 27 03:51:36.877714 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 27 03:51:36.882317 systemd[1]: Reached target sysinit.target - System Initialization. May 27 03:51:36.886848 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 03:51:36.891256 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 03:51:36.895670 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 03:51:36.900139 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 03:51:36.900158 systemd[1]: Reached target paths.target - Path Units. May 27 03:51:36.904615 systemd[1]: Reached target time-set.target - System Time Set. May 27 03:51:36.909140 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 03:51:36.913798 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 03:51:36.918172 systemd[1]: Reached target timers.target - Timer Units. May 27 03:51:36.923284 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 03:51:36.929045 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 03:51:36.935328 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 03:51:36.943186 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 03:51:36.948095 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 03:51:36.952977 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 03:51:36.957427 systemd[1]: Reached target sockets.target - Socket Units. May 27 03:51:36.961765 systemd[1]: Reached target basic.target - Basic System. May 27 03:51:36.966037 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 03:51:36.966059 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 03:51:36.967048 systemd[1]: Starting containerd.service - containerd container runtime... May 27 03:51:36.984991 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 03:51:36.990582 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 03:51:36.996120 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 03:51:37.001711 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 03:51:37.006743 coreos-metadata[2760]: May 27 03:51:37.006 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 03:51:37.007274 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 03:51:37.011257 coreos-metadata[2760]: May 27 03:51:37.011 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 03:51:37.011826 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 03:51:37.012387 jq[2765]: false May 27 03:51:37.012912 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... 
May 27 03:51:37.018495 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 03:51:37.024313 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 27 03:51:37.027281 extend-filesystems[2767]: Found loop4 May 27 03:51:37.033919 extend-filesystems[2767]: Found loop5 May 27 03:51:37.033919 extend-filesystems[2767]: Found loop6 May 27 03:51:37.033919 extend-filesystems[2767]: Found loop7 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme1n1 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p1 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p2 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p3 May 27 03:51:37.033919 extend-filesystems[2767]: Found usr May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p4 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p6 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p7 May 27 03:51:37.033919 extend-filesystems[2767]: Found nvme0n1p9 May 27 03:51:37.033919 extend-filesystems[2767]: Checking size of /dev/nvme0n1p9 May 27 03:51:37.151621 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 233815889 blocks May 27 03:51:37.030175 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 27 03:51:37.151744 extend-filesystems[2767]: Resized partition /dev/nvme0n1p9 May 27 03:51:37.042603 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 03:51:37.156107 extend-filesystems[2784]: resize2fs 1.47.2 (1-Jan-2025) May 27 03:51:37.049299 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 03:51:37.158666 dbus-daemon[2761]: [system] SELinux support is enabled May 27 03:51:37.049845 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 03:51:37.161047 update_engine[2788]: I20250527 03:51:37.113854 2788 main.cc:92] Flatcar Update Engine starting May 27 03:51:37.050423 systemd[1]: Starting update-engine.service - Update Engine... May 27 03:51:37.072387 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 03:51:37.161416 jq[2794]: true May 27 03:51:37.083238 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 03:51:37.090286 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 03:51:37.161815 tar[2798]: linux-arm64/LICENSE May 27 03:51:37.161815 tar[2798]: linux-arm64/helm May 27 03:51:37.090479 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 03:51:37.162102 jq[2799]: true May 27 03:51:37.090786 systemd[1]: motdgen.service: Deactivated successfully. May 27 03:51:37.092241 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 27 03:51:37.100271 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 03:51:37.100442 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 03:51:37.113629 systemd-logind[2785]: Watching system buttons on /dev/input/event0 (Power Button) May 27 03:51:37.113893 systemd-logind[2785]: New seat seat0. 
May 27 03:51:37.119892 (ntainerd)[2800]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 03:51:37.126350 systemd[1]: Started systemd-logind.service - User Login Management. May 27 03:51:37.158839 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 03:51:37.163623 update_engine[2788]: I20250527 03:51:37.163578 2788 update_check_scheduler.cc:74] Next update check in 7m17s May 27 03:51:37.169249 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 03:51:37.169280 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 03:51:37.169825 dbus-daemon[2761]: [system] Successfully activated service 'org.freedesktop.systemd1' May 27 03:51:37.173621 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 03:51:37.173636 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 03:51:37.176298 bash[2822]: Updated "/home/core/.ssh/authorized_keys" May 27 03:51:37.178103 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 03:51:37.183238 systemd[1]: Started update-engine.service - Update Engine. May 27 03:51:37.189665 systemd[1]: Starting sshkeys.service... May 27 03:51:37.211689 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 03:51:37.220726 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 03:51:37.226274 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
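Note: update-engine and locksmithd together drive Flatcar's automatic updates and coordinated reboots; the log shows the next update check scheduled in 7m17s, and locksmithd starts with strategy="reboot" just below. As a hedged illustration (not part of the log), their state is typically inspected with:

    update_engine_client -status    # current update state and the version/channel being tracked
    cat /etc/flatcar/update.conf    # GROUP and REBOOT_STRATEGY overrides, if the file is present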
May 27 03:51:37.245549 coreos-metadata[2837]: May 27 03:51:37.245 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 03:51:37.245851 locksmithd[2829]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 03:51:37.246639 coreos-metadata[2837]: May 27 03:51:37.246 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 03:51:37.291642 containerd[2800]: time="2025-05-27T03:51:37Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 03:51:37.292235 containerd[2800]: time="2025-05-27T03:51:37.292199160Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 03:51:37.299576 containerd[2800]: time="2025-05-27T03:51:37.299545920Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.24µs" May 27 03:51:37.299594 containerd[2800]: time="2025-05-27T03:51:37.299577200Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 03:51:37.299610 containerd[2800]: time="2025-05-27T03:51:37.299594600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 03:51:37.299775 containerd[2800]: time="2025-05-27T03:51:37.299762720Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 03:51:37.299796 containerd[2800]: time="2025-05-27T03:51:37.299779240Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 03:51:37.299815 containerd[2800]: time="2025-05-27T03:51:37.299801160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 03:51:37.299866 containerd[2800]: time="2025-05-27T03:51:37.299852760Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 03:51:37.299885 containerd[2800]: time="2025-05-27T03:51:37.299866080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 03:51:37.300072 containerd[2800]: time="2025-05-27T03:51:37.300057080Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 03:51:37.300092 containerd[2800]: time="2025-05-27T03:51:37.300073440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 03:51:37.300092 containerd[2800]: time="2025-05-27T03:51:37.300084400Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 03:51:37.300130 containerd[2800]: time="2025-05-27T03:51:37.300093680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 03:51:37.300175 containerd[2800]: time="2025-05-27T03:51:37.300164200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 03:51:37.300378 
containerd[2800]: time="2025-05-27T03:51:37.300363840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 03:51:37.300407 containerd[2800]: time="2025-05-27T03:51:37.300395480Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 03:51:37.300426 containerd[2800]: time="2025-05-27T03:51:37.300406920Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 03:51:37.300933 containerd[2800]: time="2025-05-27T03:51:37.300913520Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 03:51:37.301147 containerd[2800]: time="2025-05-27T03:51:37.301136440Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 03:51:37.301244 containerd[2800]: time="2025-05-27T03:51:37.301228640Z" level=info msg="metadata content store policy set" policy=shared May 27 03:51:37.308575 containerd[2800]: time="2025-05-27T03:51:37.308546680Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 03:51:37.308623 containerd[2800]: time="2025-05-27T03:51:37.308586720Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 03:51:37.308623 containerd[2800]: time="2025-05-27T03:51:37.308600120Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 03:51:37.308623 containerd[2800]: time="2025-05-27T03:51:37.308611640Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 03:51:37.308623 containerd[2800]: time="2025-05-27T03:51:37.308623360Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308635640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308646840Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308657640Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308668800Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308678760Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308687280Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 03:51:37.308750 containerd[2800]: time="2025-05-27T03:51:37.308699640Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 03:51:37.308854 containerd[2800]: time="2025-05-27T03:51:37.308814840Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 03:51:37.308854 containerd[2800]: 
time="2025-05-27T03:51:37.308835240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 03:51:37.308854 containerd[2800]: time="2025-05-27T03:51:37.308850760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 03:51:37.308901 containerd[2800]: time="2025-05-27T03:51:37.308860440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 03:51:37.308901 containerd[2800]: time="2025-05-27T03:51:37.308871800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 03:51:37.308901 containerd[2800]: time="2025-05-27T03:51:37.308881520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 03:51:37.308901 containerd[2800]: time="2025-05-27T03:51:37.308891280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 03:51:37.308901 containerd[2800]: time="2025-05-27T03:51:37.308901280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 03:51:37.308985 containerd[2800]: time="2025-05-27T03:51:37.308912280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 03:51:37.308985 containerd[2800]: time="2025-05-27T03:51:37.308922120Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 03:51:37.308985 containerd[2800]: time="2025-05-27T03:51:37.308931120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 03:51:37.309126 containerd[2800]: time="2025-05-27T03:51:37.309110640Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 03:51:37.309148 containerd[2800]: time="2025-05-27T03:51:37.309127480Z" level=info msg="Start snapshots syncer" May 27 03:51:37.309169 containerd[2800]: time="2025-05-27T03:51:37.309150120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 03:51:37.309383 containerd[2800]: time="2025-05-27T03:51:37.309351360Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 03:51:37.309462 containerd[2800]: time="2025-05-27T03:51:37.309394400Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 03:51:37.309482 containerd[2800]: time="2025-05-27T03:51:37.309463680Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 03:51:37.309591 containerd[2800]: time="2025-05-27T03:51:37.309575560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 03:51:37.309611 containerd[2800]: time="2025-05-27T03:51:37.309597160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 03:51:37.309611 containerd[2800]: time="2025-05-27T03:51:37.309608440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 03:51:37.309654 containerd[2800]: time="2025-05-27T03:51:37.309624080Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 03:51:37.309654 containerd[2800]: time="2025-05-27T03:51:37.309635360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 03:51:37.309654 containerd[2800]: time="2025-05-27T03:51:37.309645320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 03:51:37.309701 containerd[2800]: time="2025-05-27T03:51:37.309655120Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 03:51:37.309701 containerd[2800]: time="2025-05-27T03:51:37.309678040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 03:51:37.309701 containerd[2800]: 
time="2025-05-27T03:51:37.309688640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 03:51:37.309701 containerd[2800]: time="2025-05-27T03:51:37.309698120Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 03:51:37.309764 containerd[2800]: time="2025-05-27T03:51:37.309731440Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 03:51:37.309764 containerd[2800]: time="2025-05-27T03:51:37.309743880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 03:51:37.309764 containerd[2800]: time="2025-05-27T03:51:37.309752800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 03:51:37.309764 containerd[2800]: time="2025-05-27T03:51:37.309762080Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 03:51:37.309833 containerd[2800]: time="2025-05-27T03:51:37.309769600Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 03:51:37.309833 containerd[2800]: time="2025-05-27T03:51:37.309779320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 03:51:37.309833 containerd[2800]: time="2025-05-27T03:51:37.309789240Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 03:51:37.309882 containerd[2800]: time="2025-05-27T03:51:37.309865520Z" level=info msg="runtime interface created" May 27 03:51:37.309882 containerd[2800]: time="2025-05-27T03:51:37.309870480Z" level=info msg="created NRI interface" May 27 03:51:37.309882 containerd[2800]: time="2025-05-27T03:51:37.309877960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 03:51:37.309928 containerd[2800]: time="2025-05-27T03:51:37.309887840Z" level=info msg="Connect containerd service" May 27 03:51:37.309928 containerd[2800]: time="2025-05-27T03:51:37.309911360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 03:51:37.310545 containerd[2800]: time="2025-05-27T03:51:37.310521440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 03:51:37.343037 sshd_keygen[2789]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 03:51:37.363247 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 03:51:37.369710 systemd[1]: Starting issuegen.service - Generate /run/issue... 
May 27 03:51:37.391956 containerd[2800]: time="2025-05-27T03:51:37.391908800Z" level=info msg="Start subscribing containerd event" May 27 03:51:37.391992 containerd[2800]: time="2025-05-27T03:51:37.391976120Z" level=info msg="Start recovering state" May 27 03:51:37.392065 containerd[2800]: time="2025-05-27T03:51:37.392055840Z" level=info msg="Start event monitor" May 27 03:51:37.392084 containerd[2800]: time="2025-05-27T03:51:37.392073360Z" level=info msg="Start cni network conf syncer for default" May 27 03:51:37.392101 containerd[2800]: time="2025-05-27T03:51:37.392081960Z" level=info msg="Start streaming server" May 27 03:51:37.392101 containerd[2800]: time="2025-05-27T03:51:37.392091440Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 03:51:37.392101 containerd[2800]: time="2025-05-27T03:51:37.392098480Z" level=info msg="runtime interface starting up..." May 27 03:51:37.392161 containerd[2800]: time="2025-05-27T03:51:37.392103920Z" level=info msg="starting plugins..." May 27 03:51:37.392161 containerd[2800]: time="2025-05-27T03:51:37.392116680Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 03:51:37.392228 containerd[2800]: time="2025-05-27T03:51:37.392194440Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 03:51:37.392273 containerd[2800]: time="2025-05-27T03:51:37.392262360Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 03:51:37.392326 containerd[2800]: time="2025-05-27T03:51:37.392317680Z" level=info msg="containerd successfully booted in 0.101013s" May 27 03:51:37.392346 systemd[1]: Started containerd.service - containerd container runtime. May 27 03:51:37.397600 systemd[1]: issuegen.service: Deactivated successfully. May 27 03:51:37.397818 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 03:51:37.404751 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 03:51:37.437249 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 03:51:37.443484 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 03:51:37.449329 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 27 03:51:37.454215 systemd[1]: Reached target getty.target - Login Prompts. May 27 03:51:37.479139 tar[2798]: linux-arm64/README.md May 27 03:51:37.500930 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 03:51:37.660225 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 233815889 May 27 03:51:37.677696 extend-filesystems[2784]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required May 27 03:51:37.677696 extend-filesystems[2784]: old_desc_blocks = 1, new_desc_blocks = 112 May 27 03:51:37.677696 extend-filesystems[2784]: The filesystem on /dev/nvme0n1p9 is now 233815889 (4k) blocks long. May 27 03:51:37.725615 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: Link up May 27 03:51:37.725813 kernel: bond0: (slave enP1p1s0f0np0): Enslaving as a backup interface with an up link May 27 03:51:37.725827 extend-filesystems[2767]: Resized filesystem in /dev/nvme0n1p9 May 27 03:51:37.679974 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 03:51:37.681295 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 03:51:37.691060 systemd[1]: extend-filesystems.service: Consumed 217ms CPU time, 68.9M memory peak. 
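Note: the jump from 553472 to 233815889 blocks above is Flatcar's extend-filesystems.service performing an online grow of the mounted root partition with resize2fs 1.47.2. A hedged sketch of the equivalent manual steps on such a host (not taken from the log):

    lsblk /dev/nvme0n1               # confirm nvme0n1p9 (the root partition) now spans the available space
    sudo resize2fs /dev/nvme0n1p9    # grow the mounted ext4 filesystem to fill the partition, as the service did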
May 27 03:51:37.722796 systemd-networkd[2575]: enP1p1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:5a:06:d9.network. May 27 03:51:38.011429 coreos-metadata[2760]: May 27 03:51:38.011 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 May 27 03:51:38.011897 coreos-metadata[2760]: May 27 03:51:38.011 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 03:51:38.246770 coreos-metadata[2837]: May 27 03:51:38.246 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 May 27 03:51:38.247155 coreos-metadata[2837]: May 27 03:51:38.247 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 03:51:38.286221 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: Link up May 27 03:51:38.303071 systemd-networkd[2575]: bond0: Configuring with /etc/systemd/network/05-bond0.network. May 27 03:51:38.303246 kernel: bond0: (slave enP1p1s0f1np1): Enslaving as a backup interface with an up link May 27 03:51:38.304242 systemd-networkd[2575]: enP1p1s0f0np0: Link UP May 27 03:51:38.304482 systemd-networkd[2575]: enP1p1s0f0np0: Gained carrier May 27 03:51:38.305229 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 03:51:38.323215 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond May 27 03:51:38.334633 systemd-networkd[2575]: enP1p1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:5a:06:d8.network. May 27 03:51:38.334918 systemd-networkd[2575]: enP1p1s0f1np1: Link UP May 27 03:51:38.335114 systemd-networkd[2575]: enP1p1s0f1np1: Gained carrier May 27 03:51:38.348393 systemd-networkd[2575]: bond0: Link UP May 27 03:51:38.348669 systemd-networkd[2575]: bond0: Gained carrier May 27 03:51:38.348854 systemd-timesyncd[2756]: Network configuration changed, trying to establish connection. May 27 03:51:38.430811 kernel: bond0: (slave enP1p1s0f0np0): link status definitely up, 25000 Mbps full duplex May 27 03:51:38.430846 kernel: bond0: active interface up! May 27 03:51:38.554218 kernel: bond0: (slave enP1p1s0f1np1): link status definitely up, 25000 Mbps full duplex May 27 03:51:40.012005 coreos-metadata[2760]: May 27 03:51:40.011 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 May 27 03:51:40.247289 coreos-metadata[2837]: May 27 03:51:40.247 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 May 27 03:51:40.339277 systemd-networkd[2575]: bond0: Gained IPv6LL May 27 03:51:40.342191 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 03:51:40.348090 systemd[1]: Reached target network-online.target - Network is Online. May 27 03:51:40.355240 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:51:40.385634 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 03:51:40.407000 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 03:51:41.003833 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 03:51:41.010007 (kubelet)[2914]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 03:51:41.368968 kubelet[2914]: E0527 03:51:41.368905 2914 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 03:51:41.371459 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 03:51:41.371581 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 03:51:41.371924 systemd[1]: kubelet.service: Consumed 727ms CPU time, 259.9M memory peak. May 27 03:51:41.449549 kernel: mlx5_core 0001:01:00.0: lag map: port 1:1 port 2:2 May 27 03:51:41.449790 kernel: mlx5_core 0001:01:00.0: shared_fdb:0 mode:queue_affinity May 27 03:51:40.232394 systemd-resolved[2681]: Clock change detected. Flushing caches. May 27 03:51:40.245190 systemd-journald[2281]: Time jumped backwards, rotating. May 27 03:51:40.232513 systemd-timesyncd[2756]: Contacted time server 69.89.207.199:123 (0.flatcar.pool.ntp.org). May 27 03:51:40.232582 systemd-timesyncd[2756]: Initial clock synchronization to Tue 2025-05-27 03:51:40.232331 UTC. May 27 03:51:40.526109 coreos-metadata[2837]: May 27 03:51:40.526 INFO Fetch successful May 27 03:51:40.572538 unknown[2837]: wrote ssh authorized keys file for user: core May 27 03:51:40.593312 coreos-metadata[2760]: May 27 03:51:40.593 INFO Fetch successful May 27 03:51:40.600637 update-ssh-keys[2938]: Updated "/home/core/.ssh/authorized_keys" May 27 03:51:40.602364 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 03:51:40.609590 systemd[1]: Finished sshkeys.service. May 27 03:51:40.660025 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 03:51:40.666946 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... May 27 03:51:40.857965 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 03:51:40.864153 systemd[1]: Started sshd@0-147.28.163.138:22-139.178.89.65:36808.service - OpenSSH per-connection server daemon (139.178.89.65:36808). May 27 03:51:40.941778 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. May 27 03:51:40.947507 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 03:51:40.952947 systemd[1]: Startup finished in 4.991s (kernel) + 19.900s (initrd) + 9.247s (userspace) = 34.140s. May 27 03:51:40.994090 login[2887]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying May 27 03:51:40.995537 login[2888]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:41.000930 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 03:51:41.002067 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 03:51:41.006939 systemd-logind[2785]: New session 1 of user core. May 27 03:51:41.013739 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 03:51:41.017501 systemd[1]: Starting user@500.service - User Manager for UID 500... 
May 27 03:51:41.023224 (systemd)[2961]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 03:51:41.024912 systemd-logind[2785]: New session c1 of user core. May 27 03:51:41.153848 systemd[2961]: Queued start job for default target default.target. May 27 03:51:41.173695 systemd[2961]: Created slice app.slice - User Application Slice. May 27 03:51:41.173720 systemd[2961]: Reached target paths.target - Paths. May 27 03:51:41.173754 systemd[2961]: Reached target timers.target - Timers. May 27 03:51:41.174922 systemd[2961]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 03:51:41.182929 systemd[2961]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 03:51:41.182982 systemd[2961]: Reached target sockets.target - Sockets. May 27 03:51:41.183020 systemd[2961]: Reached target basic.target - Basic System. May 27 03:51:41.183048 systemd[2961]: Reached target default.target - Main User Target. May 27 03:51:41.183073 systemd[2961]: Startup finished in 153ms. May 27 03:51:41.183430 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 03:51:41.184867 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 03:51:41.302195 sshd[2951]: Accepted publickey for core from 139.178.89.65 port 36808 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:41.303479 sshd-session[2951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:41.306486 systemd-logind[2785]: New session 3 of user core. May 27 03:51:41.327782 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 03:51:41.671375 systemd[1]: Started sshd@1-147.28.163.138:22-139.178.89.65:36810.service - OpenSSH per-connection server daemon (139.178.89.65:36810). May 27 03:51:41.994743 login[2887]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:41.999074 systemd-logind[2785]: New session 2 of user core. May 27 03:51:42.013832 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 03:51:42.089339 sshd[2985]: Accepted publickey for core from 139.178.89.65 port 36810 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:42.090642 sshd-session[2985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:42.093447 systemd-logind[2785]: New session 4 of user core. May 27 03:51:42.114775 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 03:51:42.389341 sshd[2997]: Connection closed by 139.178.89.65 port 36810 May 27 03:51:42.389805 sshd-session[2985]: pam_unix(sshd:session): session closed for user core May 27 03:51:42.393607 systemd[1]: sshd@1-147.28.163.138:22-139.178.89.65:36810.service: Deactivated successfully. May 27 03:51:42.395339 systemd[1]: session-4.scope: Deactivated successfully. May 27 03:51:42.395908 systemd-logind[2785]: Session 4 logged out. Waiting for processes to exit. May 27 03:51:42.396788 systemd-logind[2785]: Removed session 4. May 27 03:51:42.476255 systemd[1]: Started sshd@2-147.28.163.138:22-139.178.89.65:36820.service - OpenSSH per-connection server daemon (139.178.89.65:36820). May 27 03:51:42.889316 sshd[3003]: Accepted publickey for core from 139.178.89.65 port 36820 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:42.890462 sshd-session[3003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:42.893370 systemd-logind[2785]: New session 5 of user core. 
May 27 03:51:42.915828 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 03:51:43.184770 sshd[3005]: Connection closed by 139.178.89.65 port 36820 May 27 03:51:43.185212 sshd-session[3003]: pam_unix(sshd:session): session closed for user core May 27 03:51:43.188994 systemd[1]: sshd@2-147.28.163.138:22-139.178.89.65:36820.service: Deactivated successfully. May 27 03:51:43.191113 systemd[1]: session-5.scope: Deactivated successfully. May 27 03:51:43.191652 systemd-logind[2785]: Session 5 logged out. Waiting for processes to exit. May 27 03:51:43.192524 systemd-logind[2785]: Removed session 5. May 27 03:51:43.266297 systemd[1]: Started sshd@3-147.28.163.138:22-139.178.89.65:40000.service - OpenSSH per-connection server daemon (139.178.89.65:40000). May 27 03:51:43.671704 sshd[3011]: Accepted publickey for core from 139.178.89.65 port 40000 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:43.672827 sshd-session[3011]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:43.675739 systemd-logind[2785]: New session 6 of user core. May 27 03:51:43.698780 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 03:51:43.964583 sshd[3013]: Connection closed by 139.178.89.65 port 40000 May 27 03:51:43.964977 sshd-session[3011]: pam_unix(sshd:session): session closed for user core May 27 03:51:43.968466 systemd[1]: sshd@3-147.28.163.138:22-139.178.89.65:40000.service: Deactivated successfully. May 27 03:51:43.970099 systemd[1]: session-6.scope: Deactivated successfully. May 27 03:51:43.970621 systemd-logind[2785]: Session 6 logged out. Waiting for processes to exit. May 27 03:51:43.971515 systemd-logind[2785]: Removed session 6. May 27 03:51:44.045322 systemd[1]: Started sshd@4-147.28.163.138:22-139.178.89.65:40002.service - OpenSSH per-connection server daemon (139.178.89.65:40002). May 27 03:51:44.449186 sshd[3021]: Accepted publickey for core from 139.178.89.65 port 40002 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:44.450305 sshd-session[3021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:44.453192 systemd-logind[2785]: New session 7 of user core. May 27 03:51:44.474771 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 03:51:44.687879 sudo[3024]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 03:51:44.688131 sudo[3024]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 03:51:44.714050 sudo[3024]: pam_unix(sudo:session): session closed for user root May 27 03:51:44.776129 sshd[3023]: Connection closed by 139.178.89.65 port 40002 May 27 03:51:44.776484 sshd-session[3021]: pam_unix(sshd:session): session closed for user core May 27 03:51:44.779532 systemd[1]: sshd@4-147.28.163.138:22-139.178.89.65:40002.service: Deactivated successfully. May 27 03:51:44.782157 systemd[1]: session-7.scope: Deactivated successfully. May 27 03:51:44.782747 systemd-logind[2785]: Session 7 logged out. Waiting for processes to exit. May 27 03:51:44.783572 systemd-logind[2785]: Removed session 7. May 27 03:51:44.858376 systemd[1]: Started sshd@5-147.28.163.138:22-139.178.89.65:40014.service - OpenSSH per-connection server daemon (139.178.89.65:40014). 
May 27 03:51:45.271972 sshd[3031]: Accepted publickey for core from 139.178.89.65 port 40014 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:45.273167 sshd-session[3031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:45.275991 systemd-logind[2785]: New session 8 of user core. May 27 03:51:45.296767 systemd[1]: Started session-8.scope - Session 8 of User core. May 27 03:51:45.507568 sudo[3035]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 03:51:45.507830 sudo[3035]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 03:51:45.510704 sudo[3035]: pam_unix(sudo:session): session closed for user root May 27 03:51:45.514859 sudo[3034]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 03:51:45.515102 sudo[3034]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 03:51:45.522039 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 03:51:45.568574 augenrules[3057]: No rules May 27 03:51:45.569624 systemd[1]: audit-rules.service: Deactivated successfully. May 27 03:51:45.569847 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 03:51:45.570548 sudo[3034]: pam_unix(sudo:session): session closed for user root May 27 03:51:45.633812 sshd[3033]: Connection closed by 139.178.89.65 port 40014 May 27 03:51:45.634206 sshd-session[3031]: pam_unix(sshd:session): session closed for user core May 27 03:51:45.637028 systemd[1]: sshd@5-147.28.163.138:22-139.178.89.65:40014.service: Deactivated successfully. May 27 03:51:45.638826 systemd[1]: session-8.scope: Deactivated successfully. May 27 03:51:45.640186 systemd-logind[2785]: Session 8 logged out. Waiting for processes to exit. May 27 03:51:45.640954 systemd-logind[2785]: Removed session 8. May 27 03:51:45.719238 systemd[1]: Started sshd@6-147.28.163.138:22-139.178.89.65:40018.service - OpenSSH per-connection server daemon (139.178.89.65:40018). May 27 03:51:46.134321 sshd[3067]: Accepted publickey for core from 139.178.89.65 port 40018 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 03:51:46.135493 sshd-session[3067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:51:46.138270 systemd-logind[2785]: New session 9 of user core. May 27 03:51:46.160826 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 03:51:46.370593 sudo[3070]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 03:51:46.370864 sudo[3070]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 03:51:46.668299 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 03:51:46.694027 (dockerd)[3098]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 03:51:46.908836 dockerd[3098]: time="2025-05-27T03:51:46.908789061Z" level=info msg="Starting up" May 27 03:51:46.909921 dockerd[3098]: time="2025-05-27T03:51:46.909900741Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 03:51:46.936264 dockerd[3098]: time="2025-05-27T03:51:46.936210581Z" level=info msg="Loading containers: start." 
May 27 03:51:46.948677 kernel: Initializing XFRM netlink socket May 27 03:51:47.155589 systemd-networkd[2575]: docker0: Link UP May 27 03:51:47.156344 dockerd[3098]: time="2025-05-27T03:51:47.156316901Z" level=info msg="Loading containers: done." May 27 03:51:47.165260 dockerd[3098]: time="2025-05-27T03:51:47.165231501Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 03:51:47.165314 dockerd[3098]: time="2025-05-27T03:51:47.165294541Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 03:51:47.165394 dockerd[3098]: time="2025-05-27T03:51:47.165382701Z" level=info msg="Initializing buildkit" May 27 03:51:47.179270 dockerd[3098]: time="2025-05-27T03:51:47.179249141Z" level=info msg="Completed buildkit initialization" May 27 03:51:47.183987 dockerd[3098]: time="2025-05-27T03:51:47.183966741Z" level=info msg="Daemon has completed initialization" May 27 03:51:47.184043 dockerd[3098]: time="2025-05-27T03:51:47.184012021Z" level=info msg="API listen on /run/docker.sock" May 27 03:51:47.184115 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 03:51:47.734994 containerd[2800]: time="2025-05-27T03:51:47.734962141Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\"" May 27 03:51:47.925895 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3857392493-merged.mount: Deactivated successfully. May 27 03:51:48.970862 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount757845064.mount: Deactivated successfully. May 27 03:51:50.127465 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 03:51:50.128935 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 27 03:51:50.194999 containerd[2800]: time="2025-05-27T03:51:50.194956421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:50.195217 containerd[2800]: time="2025-05-27T03:51:50.194983901Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=26326311" May 27 03:51:50.195896 containerd[2800]: time="2025-05-27T03:51:50.195872381Z" level=info msg="ImageCreate event name:\"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:50.199591 containerd[2800]: time="2025-05-27T03:51:50.199555101Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:50.200451 containerd[2800]: time="2025-05-27T03:51:50.200436421Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"26323111\" in 2.46543804s" May 27 03:51:50.200496 containerd[2800]: time="2025-05-27T03:51:50.200485021Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:42968274c3d27c41cdc146f5442f122c1c74960e299c13e2f348d2fe835a9134\"" May 27 03:51:50.201072 containerd[2800]: time="2025-05-27T03:51:50.201053541Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\"" May 27 03:51:50.260237 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:51:50.263474 (kubelet)[3431]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 03:51:50.294215 kubelet[3431]: E0527 03:51:50.294187 3431 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 03:51:50.297271 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 03:51:50.297393 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 03:51:50.297689 systemd[1]: kubelet.service: Consumed 148ms CPU time, 116.2M memory peak. 
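The kubelet exit at 03:51:50 is expected at this point in the boot: /var/lib/kubelet/config.yaml does not exist yet, so the unit fails and systemd keeps scheduling restarts until the file appears (it is normally written during kubeadm init/join, which this provisioning run has not reached yet). For orientation only, a minimal sketch of what that path holds, assuming the k8s.io/kubelet/config/v1beta1 and sigs.k8s.io/yaml modules; this is not the kubelet's own loader.

```go
// Sketch: /var/lib/kubelet/config.yaml (the path from the error above) is a
// typed KubeletConfiguration object. Assumes k8s.io/kubelet/config/v1beta1
// and sigs.k8s.io/yaml are available; hypothetical illustration only.
package main

import (
	"fmt"
	"log"
	"os"

	kubeletconfig "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	data, err := os.ReadFile("/var/lib/kubelet/config.yaml")
	if err != nil {
		// Exactly the state the log shows before the file is written:
		// the kubelet exits and systemd retries the unit.
		log.Fatalf("kubelet config not present yet: %v", err)
	}

	var cfg kubeletconfig.KubeletConfiguration
	if err := yaml.Unmarshal(data, &cfg); err != nil {
		log.Fatalf("parse: %v", err)
	}

	// Two fields that also surface later in this log.
	fmt.Println("cgroupDriver:", cfg.CgroupDriver)
	fmt.Println("staticPodPath:", cfg.StaticPodPath)
}
```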
May 27 03:51:51.323515 containerd[2800]: time="2025-05-27T03:51:51.323314621Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:51.323827 containerd[2800]: time="2025-05-27T03:51:51.323606061Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=22530547" May 27 03:51:51.325721 containerd[2800]: time="2025-05-27T03:51:51.325693741Z" level=info msg="ImageCreate event name:\"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:51.328079 containerd[2800]: time="2025-05-27T03:51:51.328054261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:51.329068 containerd[2800]: time="2025-05-27T03:51:51.329045461Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"24066313\" in 1.12796488s" May 27 03:51:51.329093 containerd[2800]: time="2025-05-27T03:51:51.329076021Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:82042044d6ea1f1e5afda9c7351883800adbde447314786c4e5a2fd9e42aab09\"" May 27 03:51:51.329485 containerd[2800]: time="2025-05-27T03:51:51.329461821Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\"" May 27 03:51:52.329560 containerd[2800]: time="2025-05-27T03:51:52.329526461Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:52.329560 containerd[2800]: time="2025-05-27T03:51:52.329549981Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=17484190" May 27 03:51:52.330443 containerd[2800]: time="2025-05-27T03:51:52.330419261Z" level=info msg="ImageCreate event name:\"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:52.332661 containerd[2800]: time="2025-05-27T03:51:52.332641301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:52.333610 containerd[2800]: time="2025-05-27T03:51:52.333585221Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"19019974\" in 1.00409112s" May 27 03:51:52.333659 containerd[2800]: time="2025-05-27T03:51:52.333614461Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference \"sha256:e149336437f90109dad736c8a42e4b73c137a66579be8f3b9a456bcc62af3f9b\"" May 27 03:51:52.333918 
containerd[2800]: time="2025-05-27T03:51:52.333895821Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\"" May 27 03:51:53.122087 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4199619627.mount: Deactivated successfully. May 27 03:51:53.305818 containerd[2800]: time="2025-05-27T03:51:53.305773421Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:53.305926 containerd[2800]: time="2025-05-27T03:51:53.305789861Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=27377375" May 27 03:51:53.306453 containerd[2800]: time="2025-05-27T03:51:53.306433101Z" level=info msg="ImageCreate event name:\"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:53.307875 containerd[2800]: time="2025-05-27T03:51:53.307856061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:53.308471 containerd[2800]: time="2025-05-27T03:51:53.308445501Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"27376394\" in 974.52112ms" May 27 03:51:53.308497 containerd[2800]: time="2025-05-27T03:51:53.308475461Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:69b7afc06f22edcae3b6a7d80cdacb488a5415fd605e89534679e5ebc41375fc\"" May 27 03:51:53.308878 containerd[2800]: time="2025-05-27T03:51:53.308853981Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 03:51:53.720036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount394341703.mount: Deactivated successfully. 
May 27 03:51:54.389040 containerd[2800]: time="2025-05-27T03:51:54.389001661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:54.389343 containerd[2800]: time="2025-05-27T03:51:54.389004741Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622" May 27 03:51:54.389921 containerd[2800]: time="2025-05-27T03:51:54.389900261Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:54.392545 containerd[2800]: time="2025-05-27T03:51:54.392520661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:54.393553 containerd[2800]: time="2025-05-27T03:51:54.393525981Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.0846392s" May 27 03:51:54.393596 containerd[2800]: time="2025-05-27T03:51:54.393558581Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" May 27 03:51:54.393993 containerd[2800]: time="2025-05-27T03:51:54.393966781Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 03:51:54.674352 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount468749948.mount: Deactivated successfully. 
May 27 03:51:54.675377 containerd[2800]: time="2025-05-27T03:51:54.675282261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:51:54.675377 containerd[2800]: time="2025-05-27T03:51:54.675365141Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703" May 27 03:51:54.676011 containerd[2800]: time="2025-05-27T03:51:54.675989141Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:51:54.677616 containerd[2800]: time="2025-05-27T03:51:54.677589181Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 03:51:54.678575 containerd[2800]: time="2025-05-27T03:51:54.678554781Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 284.5562ms" May 27 03:51:54.678599 containerd[2800]: time="2025-05-27T03:51:54.678579821Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 27 03:51:54.679129 containerd[2800]: time="2025-05-27T03:51:54.678937421Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 27 03:51:54.975773 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount49799264.mount: Deactivated successfully. 
May 27 03:51:57.348057 containerd[2800]: time="2025-05-27T03:51:57.348013101Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:57.348431 containerd[2800]: time="2025-05-27T03:51:57.348048021Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812469" May 27 03:51:57.349048 containerd[2800]: time="2025-05-27T03:51:57.349026101Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:57.351723 containerd[2800]: time="2025-05-27T03:51:57.351699101Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:51:57.352769 containerd[2800]: time="2025-05-27T03:51:57.352745821Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.6737772s" May 27 03:51:57.352825 containerd[2800]: time="2025-05-27T03:51:57.352773941Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" May 27 03:52:00.503337 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 03:52:00.505292 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:52:00.642356 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:00.645683 (kubelet)[3668]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 03:52:00.690002 kubelet[3668]: E0527 03:52:00.689967 3668 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 03:52:00.692522 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 03:52:00.692644 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 03:52:00.694738 systemd[1]: kubelet.service: Consumed 142ms CPU time, 118.9M memory peak. May 27 03:52:02.937922 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:02.938380 systemd[1]: kubelet.service: Consumed 142ms CPU time, 118.9M memory peak. May 27 03:52:02.940994 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:52:02.959610 systemd[1]: Reload requested from client PID 3697 ('systemctl') (unit session-9.scope)... May 27 03:52:02.959619 systemd[1]: Reloading... May 27 03:52:03.020680 zram_generator::config[3744]: No configuration found. May 27 03:52:03.096463 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:52:03.199707 systemd[1]: Reloading finished in 239 ms. 
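The pulls logged above (kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy, coredns, pause, etcd) are served by containerd's CRI plugin, not by the Docker daemon started earlier. A rough stand-alone equivalent using containerd's Go client, offered as a sketch only (assumptions: the classic github.com/containerd/containerd client module, whereas the daemon in this log is containerd v2 with its client under a /v2 import path; the default socket; and the "k8s.io" namespace that CRI-managed images are stored in).

```go
// Sketch: pull one of the images from the log by hand through containerd's
// Go client. Module path and socket location are assumptions, see above.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images (the ones listed in this log) live in "k8s.io".
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "registry.k8s.io/pause:3.10", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), "=>", img.Target().Digest)
}
```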
May 27 03:52:03.253278 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 27 03:52:03.253363 systemd[1]: kubelet.service: Failed with result 'signal'. May 27 03:52:03.253630 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:03.256876 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:52:03.378565 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:03.381860 (kubelet)[3804]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 03:52:03.412408 kubelet[3804]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:52:03.412408 kubelet[3804]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 03:52:03.412408 kubelet[3804]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:52:03.412632 kubelet[3804]: I0527 03:52:03.412465 3804 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 03:52:03.878619 kubelet[3804]: I0527 03:52:03.878586 3804 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 03:52:03.878619 kubelet[3804]: I0527 03:52:03.878612 3804 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 03:52:03.878851 kubelet[3804]: I0527 03:52:03.878838 3804 server.go:954] "Client rotation is on, will bootstrap in background" May 27 03:52:03.901793 kubelet[3804]: E0527 03:52:03.901768 3804 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://147.28.163.138:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.28.163.138:6443: connect: connection refused" logger="UnhandledError" May 27 03:52:03.902787 kubelet[3804]: I0527 03:52:03.902768 3804 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:52:03.907441 kubelet[3804]: I0527 03:52:03.907425 3804 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 03:52:03.927963 kubelet[3804]: I0527 03:52:03.927933 3804 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 03:52:03.928584 kubelet[3804]: I0527 03:52:03.928547 3804 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 03:52:03.928745 kubelet[3804]: I0527 03:52:03.928586 3804 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-636136d453","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 03:52:03.928825 kubelet[3804]: I0527 03:52:03.928820 3804 topology_manager.go:138] "Creating topology manager with none policy" May 27 03:52:03.928851 kubelet[3804]: I0527 03:52:03.928829 3804 container_manager_linux.go:304] "Creating device plugin manager" May 27 03:52:03.929043 kubelet[3804]: I0527 03:52:03.929033 3804 state_mem.go:36] "Initialized new in-memory state store" May 27 03:52:03.931866 kubelet[3804]: I0527 03:52:03.931841 3804 kubelet.go:446] "Attempting to sync node with API server" May 27 03:52:03.931907 kubelet[3804]: I0527 03:52:03.931868 3804 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 03:52:03.931907 kubelet[3804]: I0527 03:52:03.931887 3804 kubelet.go:352] "Adding apiserver pod source" May 27 03:52:03.931907 kubelet[3804]: I0527 03:52:03.931896 3804 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 03:52:03.934824 kubelet[3804]: W0527 03:52:03.934769 3804 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://147.28.163.138:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 147.28.163.138:6443: connect: connection refused May 27 03:52:03.934878 kubelet[3804]: E0527 03:52:03.934857 3804 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://147.28.163.138:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.28.163.138:6443: connect: connection refused" logger="UnhandledError" May 27 03:52:03.936097 kubelet[3804]: 
W0527 03:52:03.936060 3804 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://147.28.163.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-636136d453&limit=500&resourceVersion=0": dial tcp 147.28.163.138:6443: connect: connection refused May 27 03:52:03.936121 kubelet[3804]: E0527 03:52:03.936106 3804 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://147.28.163.138:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-636136d453&limit=500&resourceVersion=0\": dial tcp 147.28.163.138:6443: connect: connection refused" logger="UnhandledError" May 27 03:52:03.936252 kubelet[3804]: I0527 03:52:03.936229 3804 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 03:52:03.936807 kubelet[3804]: I0527 03:52:03.936793 3804 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 03:52:03.936907 kubelet[3804]: W0527 03:52:03.936900 3804 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 03:52:03.937745 kubelet[3804]: I0527 03:52:03.937730 3804 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 03:52:03.937767 kubelet[3804]: I0527 03:52:03.937762 3804 server.go:1287] "Started kubelet" May 27 03:52:03.937857 kubelet[3804]: I0527 03:52:03.937821 3804 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 03:52:03.939812 kubelet[3804]: I0527 03:52:03.939790 3804 server.go:479] "Adding debug handlers to kubelet server" May 27 03:52:03.941410 kubelet[3804]: I0527 03:52:03.941392 3804 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 03:52:03.941410 kubelet[3804]: I0527 03:52:03.941397 3804 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 03:52:03.941584 kubelet[3804]: E0527 03:52:03.941566 3804 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-636136d453\" not found" May 27 03:52:03.941606 kubelet[3804]: I0527 03:52:03.941573 3804 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 03:52:03.941623 kubelet[3804]: I0527 03:52:03.941586 3804 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 03:52:03.941701 kubelet[3804]: I0527 03:52:03.941689 3804 reconciler.go:26] "Reconciler: start to sync state" May 27 03:52:03.941760 kubelet[3804]: I0527 03:52:03.941711 3804 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 03:52:03.941787 kubelet[3804]: E0527 03:52:03.941771 3804 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 03:52:03.941899 kubelet[3804]: W0527 03:52:03.941863 3804 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://147.28.163.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 147.28.163.138:6443: connect: connection refused May 27 03:52:03.941926 kubelet[3804]: E0527 03:52:03.941905 3804 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://147.28.163.138:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.28.163.138:6443: connect: connection refused" logger="UnhandledError" May 27 03:52:03.941949 kubelet[3804]: I0527 03:52:03.941933 3804 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 03:52:03.942041 kubelet[3804]: I0527 03:52:03.942027 3804 factory.go:221] Registration of the systemd container factory successfully May 27 03:52:03.942130 kubelet[3804]: I0527 03:52:03.942117 3804 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 03:52:03.942184 kubelet[3804]: E0527 03:52:03.942163 3804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.163.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-636136d453?timeout=10s\": dial tcp 147.28.163.138:6443: connect: connection refused" interval="200ms" May 27 03:52:03.942807 kubelet[3804]: I0527 03:52:03.942792 3804 factory.go:221] Registration of the containerd container factory successfully May 27 03:52:03.942919 kubelet[3804]: E0527 03:52:03.942702 3804 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.28.163.138:6443/api/v1/namespaces/default/events\": dial tcp 147.28.163.138:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-a-636136d453.184345e38ac1ee55 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-a-636136d453,UID:ci-4344.0.0-a-636136d453,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-a-636136d453,},FirstTimestamp:2025-05-27 03:52:03.937742421 +0000 UTC m=+0.553088761,LastTimestamp:2025-05-27 03:52:03.937742421 +0000 UTC m=+0.553088761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-a-636136d453,}" May 27 03:52:03.955729 kubelet[3804]: I0527 03:52:03.955689 3804 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 03:52:03.956705 kubelet[3804]: I0527 03:52:03.956691 3804 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 03:52:03.956733 kubelet[3804]: I0527 03:52:03.956713 3804 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 03:52:03.956733 kubelet[3804]: I0527 03:52:03.956729 3804 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 03:52:03.956770 kubelet[3804]: I0527 03:52:03.956736 3804 kubelet.go:2382] "Starting kubelet main sync loop" May 27 03:52:03.956786 kubelet[3804]: E0527 03:52:03.956772 3804 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 03:52:03.957015 kubelet[3804]: I0527 03:52:03.957001 3804 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 03:52:03.957036 kubelet[3804]: I0527 03:52:03.957016 3804 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 03:52:03.957036 kubelet[3804]: I0527 03:52:03.957033 3804 state_mem.go:36] "Initialized new in-memory state store" May 27 03:52:03.957135 kubelet[3804]: W0527 03:52:03.957101 3804 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://147.28.163.138:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 147.28.163.138:6443: connect: connection refused May 27 03:52:03.957165 kubelet[3804]: E0527 03:52:03.957150 3804 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://147.28.163.138:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.28.163.138:6443: connect: connection refused" logger="UnhandledError" May 27 03:52:03.957620 kubelet[3804]: I0527 03:52:03.957608 3804 policy_none.go:49] "None policy: Start" May 27 03:52:03.957638 kubelet[3804]: I0527 03:52:03.957625 3804 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 03:52:03.957638 kubelet[3804]: I0527 03:52:03.957635 3804 state_mem.go:35] "Initializing new in-memory state store" May 27 03:52:03.961760 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 27 03:52:03.975867 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 03:52:03.978413 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 03:52:03.994314 kubelet[3804]: I0527 03:52:03.994294 3804 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 03:52:03.994493 kubelet[3804]: I0527 03:52:03.994479 3804 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 03:52:03.994524 kubelet[3804]: I0527 03:52:03.994492 3804 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 03:52:03.994658 kubelet[3804]: I0527 03:52:03.994641 3804 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 03:52:03.995123 kubelet[3804]: E0527 03:52:03.995109 3804 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 03:52:03.995150 kubelet[3804]: E0527 03:52:03.995143 3804 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344.0.0-a-636136d453\" not found" May 27 03:52:04.063822 systemd[1]: Created slice kubepods-burstable-pod64eeadc5624e0fc6d66db7d77f616932.slice - libcontainer container kubepods-burstable-pod64eeadc5624e0fc6d66db7d77f616932.slice. 
May 27 03:52:04.097171 kubelet[3804]: I0527 03:52:04.097141 3804 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.097531 kubelet[3804]: E0527 03:52:04.097509 3804 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.28.163.138:6443/api/v1/nodes\": dial tcp 147.28.163.138:6443: connect: connection refused" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.100005 kubelet[3804]: E0527 03:52:04.099988 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.102672 systemd[1]: Created slice kubepods-burstable-pod1bccf1d2d7a6f70cf1e05309180c9478.slice - libcontainer container kubepods-burstable-pod1bccf1d2d7a6f70cf1e05309180c9478.slice. May 27 03:52:04.104021 kubelet[3804]: E0527 03:52:04.104003 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.125030 systemd[1]: Created slice kubepods-burstable-pod3f3b2db9100cde75d750d7e671d9fc72.slice - libcontainer container kubepods-burstable-pod3f3b2db9100cde75d750d7e671d9fc72.slice. May 27 03:52:04.126408 kubelet[3804]: E0527 03:52:04.126391 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.142725 kubelet[3804]: I0527 03:52:04.142639 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: \"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:04.142798 kubelet[3804]: E0527 03:52:04.142772 3804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.163.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-636136d453?timeout=10s\": dial tcp 147.28.163.138:6443: connect: connection refused" interval="400ms" May 27 03:52:04.243058 kubelet[3804]: I0527 03:52:04.243033 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: \"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:04.243111 kubelet[3804]: I0527 03:52:04.243061 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:04.243111 kubelet[3804]: I0527 03:52:04.243079 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: 
\"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:04.243111 kubelet[3804]: I0527 03:52:04.243093 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:04.243111 kubelet[3804]: I0527 03:52:04.243110 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:04.243248 kubelet[3804]: I0527 03:52:04.243124 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:04.243248 kubelet[3804]: I0527 03:52:04.243163 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:04.243248 kubelet[3804]: I0527 03:52:04.243213 3804 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f3b2db9100cde75d750d7e671d9fc72-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-636136d453\" (UID: \"3f3b2db9100cde75d750d7e671d9fc72\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:04.299470 kubelet[3804]: I0527 03:52:04.299448 3804 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.299745 kubelet[3804]: E0527 03:52:04.299715 3804 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://147.28.163.138:6443/api/v1/nodes\": dial tcp 147.28.163.138:6443: connect: connection refused" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.401232 containerd[2800]: time="2025-05-27T03:52:04.401183221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-636136d453,Uid:64eeadc5624e0fc6d66db7d77f616932,Namespace:kube-system,Attempt:0,}" May 27 03:52:04.404536 containerd[2800]: time="2025-05-27T03:52:04.404512301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-636136d453,Uid:1bccf1d2d7a6f70cf1e05309180c9478,Namespace:kube-system,Attempt:0,}" May 27 03:52:04.411710 containerd[2800]: time="2025-05-27T03:52:04.411684781Z" level=info msg="connecting to shim 04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6" address="unix:///run/containerd/s/27be8e4082c06ca14c095de5d9a4a7bda9e005bdab51d142ba52e37a7be654cb" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:04.413298 containerd[2800]: 
time="2025-05-27T03:52:04.413270021Z" level=info msg="connecting to shim 22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095" address="unix:///run/containerd/s/543da9b61cd72956c315c588b1394fdf9acb32c0dc51902bdf18e808359aa13e" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:04.427001 containerd[2800]: time="2025-05-27T03:52:04.426971941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-636136d453,Uid:3f3b2db9100cde75d750d7e671d9fc72,Namespace:kube-system,Attempt:0,}" May 27 03:52:04.435857 systemd[1]: Started cri-containerd-04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6.scope - libcontainer container 04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6. May 27 03:52:04.436908 containerd[2800]: time="2025-05-27T03:52:04.436872781Z" level=info msg="connecting to shim e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486" address="unix:///run/containerd/s/bf34443d2873fc045e8119d654d167bc25e9e63555cb21856604ae80f914c1c3" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:04.437227 systemd[1]: Started cri-containerd-22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095.scope - libcontainer container 22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095. May 27 03:52:04.449565 systemd[1]: Started cri-containerd-e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486.scope - libcontainer container e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486. May 27 03:52:04.462193 containerd[2800]: time="2025-05-27T03:52:04.462163661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-636136d453,Uid:64eeadc5624e0fc6d66db7d77f616932,Namespace:kube-system,Attempt:0,} returns sandbox id \"04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6\"" May 27 03:52:04.462854 containerd[2800]: time="2025-05-27T03:52:04.462833421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-636136d453,Uid:1bccf1d2d7a6f70cf1e05309180c9478,Namespace:kube-system,Attempt:0,} returns sandbox id \"22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095\"" May 27 03:52:04.465405 containerd[2800]: time="2025-05-27T03:52:04.465381861Z" level=info msg="CreateContainer within sandbox \"04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 03:52:04.465680 containerd[2800]: time="2025-05-27T03:52:04.465648381Z" level=info msg="CreateContainer within sandbox \"22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 03:52:04.469519 containerd[2800]: time="2025-05-27T03:52:04.469493061Z" level=info msg="Container e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:04.470352 containerd[2800]: time="2025-05-27T03:52:04.470330101Z" level=info msg="Container 795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:04.473100 containerd[2800]: time="2025-05-27T03:52:04.473071861Z" level=info msg="CreateContainer within sandbox \"22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0\"" May 27 03:52:04.473244 containerd[2800]: 
time="2025-05-27T03:52:04.473221141Z" level=info msg="CreateContainer within sandbox \"04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa\"" May 27 03:52:04.473517 containerd[2800]: time="2025-05-27T03:52:04.473500421Z" level=info msg="StartContainer for \"e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0\"" May 27 03:52:04.473595 containerd[2800]: time="2025-05-27T03:52:04.473512661Z" level=info msg="StartContainer for \"795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa\"" May 27 03:52:04.474539 containerd[2800]: time="2025-05-27T03:52:04.474516541Z" level=info msg="connecting to shim e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0" address="unix:///run/containerd/s/543da9b61cd72956c315c588b1394fdf9acb32c0dc51902bdf18e808359aa13e" protocol=ttrpc version=3 May 27 03:52:04.474580 containerd[2800]: time="2025-05-27T03:52:04.474557261Z" level=info msg="connecting to shim 795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa" address="unix:///run/containerd/s/27be8e4082c06ca14c095de5d9a4a7bda9e005bdab51d142ba52e37a7be654cb" protocol=ttrpc version=3 May 27 03:52:04.475119 containerd[2800]: time="2025-05-27T03:52:04.475104501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-636136d453,Uid:3f3b2db9100cde75d750d7e671d9fc72,Namespace:kube-system,Attempt:0,} returns sandbox id \"e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486\"" May 27 03:52:04.476631 containerd[2800]: time="2025-05-27T03:52:04.476610821Z" level=info msg="CreateContainer within sandbox \"e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 03:52:04.480183 containerd[2800]: time="2025-05-27T03:52:04.480162061Z" level=info msg="Container 5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:04.482883 containerd[2800]: time="2025-05-27T03:52:04.482853541Z" level=info msg="CreateContainer within sandbox \"e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606\"" May 27 03:52:04.483173 containerd[2800]: time="2025-05-27T03:52:04.483149941Z" level=info msg="StartContainer for \"5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606\"" May 27 03:52:04.484110 containerd[2800]: time="2025-05-27T03:52:04.484088701Z" level=info msg="connecting to shim 5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606" address="unix:///run/containerd/s/bf34443d2873fc045e8119d654d167bc25e9e63555cb21856604ae80f914c1c3" protocol=ttrpc version=3 May 27 03:52:04.500853 systemd[1]: Started cri-containerd-795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa.scope - libcontainer container 795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa. May 27 03:52:04.502002 systemd[1]: Started cri-containerd-e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0.scope - libcontainer container e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0. 
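The RunPodSandbox / CreateContainer / StartContainer entries above are the kubelet driving the CRI API over containerd's socket to bring up the three static control-plane pods. A small sketch of querying that same endpoint directly (assumptions: the k8s.io/cri-api v1 bindings, google.golang.org/grpc, and containerd's default socket as the CRI endpoint; crictl wraps the same calls).

```go
// Sketch: list the pod sandboxes created above via the CRI runtime service.
// Endpoint path and module availability are assumptions, see above.
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// The kube-apiserver, kube-controller-manager and kube-scheduler
	// sandboxes from the log would show up here once created.
	resp, err := rt.ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		log.Fatal(err)
	}
	for _, s := range resp.Items {
		fmt.Printf("%s/%s state=%s id=%s\n",
			s.Metadata.Namespace, s.Metadata.Name, s.State, s.Id)
	}
}
```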
May 27 03:52:04.504279 systemd[1]: Started cri-containerd-5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606.scope - libcontainer container 5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606. May 27 03:52:04.530659 containerd[2800]: time="2025-05-27T03:52:04.530568741Z" level=info msg="StartContainer for \"795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa\" returns successfully" May 27 03:52:04.531414 containerd[2800]: time="2025-05-27T03:52:04.531378181Z" level=info msg="StartContainer for \"e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0\" returns successfully" May 27 03:52:04.533558 containerd[2800]: time="2025-05-27T03:52:04.533539381Z" level=info msg="StartContainer for \"5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606\" returns successfully" May 27 03:52:04.543256 kubelet[3804]: E0527 03:52:04.543223 3804 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.163.138:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-636136d453?timeout=10s\": dial tcp 147.28.163.138:6443: connect: connection refused" interval="800ms" May 27 03:52:04.702696 kubelet[3804]: I0527 03:52:04.702572 3804 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.961879 kubelet[3804]: E0527 03:52:04.961841 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.962815 kubelet[3804]: E0527 03:52:04.962795 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:04.963884 kubelet[3804]: E0527 03:52:04.963865 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:05.909029 kubelet[3804]: E0527 03:52:05.908993 3804 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:05.932973 kubelet[3804]: I0527 03:52:05.932956 3804 apiserver.go:52] "Watching apiserver" May 27 03:52:05.942082 kubelet[3804]: I0527 03:52:05.942060 3804 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 03:52:05.965688 kubelet[3804]: E0527 03:52:05.965660 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:05.965984 kubelet[3804]: E0527 03:52:05.965967 3804 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-636136d453\" not found" node="ci-4344.0.0-a-636136d453" May 27 03:52:06.014179 kubelet[3804]: I0527 03:52:06.014140 3804 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-636136d453" May 27 03:52:06.042464 kubelet[3804]: I0527 03:52:06.042442 3804 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:06.047302 kubelet[3804]: E0527 03:52:06.047277 3804 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-controller-manager-ci-4344.0.0-a-636136d453\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:06.047302 kubelet[3804]: I0527 03:52:06.047296 3804 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:06.048593 kubelet[3804]: E0527 03:52:06.048578 3804 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-636136d453\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:06.048593 kubelet[3804]: I0527 03:52:06.048591 3804 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:06.050020 kubelet[3804]: E0527 03:52:06.049999 3804 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-a-636136d453\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:07.814911 systemd[1]: Reload requested from client PID 4219 ('systemctl') (unit session-9.scope)... May 27 03:52:07.814921 systemd[1]: Reloading... May 27 03:52:07.878683 zram_generator::config[4270]: No configuration found. May 27 03:52:07.953736 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:52:08.067106 systemd[1]: Reloading finished in 251 ms. May 27 03:52:08.089702 kubelet[3804]: I0527 03:52:08.089613 3804 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:52:08.089759 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:52:08.101477 systemd[1]: kubelet.service: Deactivated successfully. May 27 03:52:08.101776 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:08.101826 systemd[1]: kubelet.service: Consumed 1.020s CPU time, 144.8M memory peak. May 27 03:52:08.103571 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:52:08.260811 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:52:08.264299 (kubelet)[4330]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 03:52:08.304593 kubelet[4330]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:52:08.304593 kubelet[4330]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 03:52:08.304593 kubelet[4330]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 27 03:52:08.304777 kubelet[4330]: I0527 03:52:08.304650 4330 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 03:52:08.309775 kubelet[4330]: I0527 03:52:08.309754 4330 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 03:52:08.309803 kubelet[4330]: I0527 03:52:08.309776 4330 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 03:52:08.310036 kubelet[4330]: I0527 03:52:08.310026 4330 server.go:954] "Client rotation is on, will bootstrap in background" May 27 03:52:08.311213 kubelet[4330]: I0527 03:52:08.311200 4330 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 03:52:08.313937 kubelet[4330]: I0527 03:52:08.313923 4330 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:52:08.318002 kubelet[4330]: I0527 03:52:08.317957 4330 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 03:52:08.338008 kubelet[4330]: I0527 03:52:08.337983 4330 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 03:52:08.338197 kubelet[4330]: I0527 03:52:08.338176 4330 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 03:52:08.338356 kubelet[4330]: I0527 03:52:08.338199 4330 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-636136d453","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 03:52:08.338425 kubelet[4330]: I0527 03:52:08.338366 4330 topology_manager.go:138] "Creating topology manager with none policy" May 27 03:52:08.338425 kubelet[4330]: I0527 03:52:08.338375 4330 container_manager_linux.go:304] "Creating device plugin manager" May 27 03:52:08.338469 kubelet[4330]: I0527 03:52:08.338442 4330 state_mem.go:36] "Initialized new in-memory state store" May 27 03:52:08.338812 
kubelet[4330]: I0527 03:52:08.338798 4330 kubelet.go:446] "Attempting to sync node with API server" May 27 03:52:08.338841 kubelet[4330]: I0527 03:52:08.338813 4330 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 03:52:08.338841 kubelet[4330]: I0527 03:52:08.338835 4330 kubelet.go:352] "Adding apiserver pod source" May 27 03:52:08.338877 kubelet[4330]: I0527 03:52:08.338859 4330 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 03:52:08.339510 kubelet[4330]: I0527 03:52:08.339490 4330 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 03:52:08.339953 kubelet[4330]: I0527 03:52:08.339940 4330 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 03:52:08.340330 kubelet[4330]: I0527 03:52:08.340318 4330 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 03:52:08.340356 kubelet[4330]: I0527 03:52:08.340350 4330 server.go:1287] "Started kubelet" May 27 03:52:08.340497 kubelet[4330]: I0527 03:52:08.340405 4330 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 03:52:08.340520 kubelet[4330]: I0527 03:52:08.340476 4330 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 03:52:08.340686 kubelet[4330]: I0527 03:52:08.340673 4330 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 03:52:08.341400 kubelet[4330]: I0527 03:52:08.341382 4330 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 03:52:08.341490 kubelet[4330]: E0527 03:52:08.341472 4330 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-636136d453\" not found" May 27 03:52:08.341490 kubelet[4330]: I0527 03:52:08.341477 4330 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 03:52:08.341530 kubelet[4330]: I0527 03:52:08.341502 4330 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 03:52:08.341600 kubelet[4330]: I0527 03:52:08.341578 4330 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 03:52:08.341659 kubelet[4330]: I0527 03:52:08.341645 4330 reconciler.go:26] "Reconciler: start to sync state" May 27 03:52:08.341906 kubelet[4330]: E0527 03:52:08.341889 4330 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 03:52:08.342480 kubelet[4330]: I0527 03:52:08.342464 4330 server.go:479] "Adding debug handlers to kubelet server" May 27 03:52:08.342727 kubelet[4330]: I0527 03:52:08.342709 4330 factory.go:221] Registration of the containerd container factory successfully May 27 03:52:08.342727 kubelet[4330]: I0527 03:52:08.342724 4330 factory.go:221] Registration of the systemd container factory successfully May 27 03:52:08.342833 kubelet[4330]: I0527 03:52:08.342815 4330 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 03:52:08.349213 kubelet[4330]: I0527 03:52:08.349180 4330 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" May 27 03:52:08.350177 kubelet[4330]: I0527 03:52:08.350162 4330 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 03:52:08.350198 kubelet[4330]: I0527 03:52:08.350181 4330 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 03:52:08.350216 kubelet[4330]: I0527 03:52:08.350199 4330 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 03:52:08.350216 kubelet[4330]: I0527 03:52:08.350207 4330 kubelet.go:2382] "Starting kubelet main sync loop" May 27 03:52:08.350266 kubelet[4330]: E0527 03:52:08.350247 4330 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 03:52:08.373544 kubelet[4330]: I0527 03:52:08.373517 4330 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 03:52:08.373544 kubelet[4330]: I0527 03:52:08.373536 4330 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 03:52:08.373622 kubelet[4330]: I0527 03:52:08.373556 4330 state_mem.go:36] "Initialized new in-memory state store" May 27 03:52:08.373716 kubelet[4330]: I0527 03:52:08.373703 4330 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 03:52:08.373738 kubelet[4330]: I0527 03:52:08.373714 4330 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 03:52:08.373738 kubelet[4330]: I0527 03:52:08.373733 4330 policy_none.go:49] "None policy: Start" May 27 03:52:08.373780 kubelet[4330]: I0527 03:52:08.373741 4330 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 03:52:08.373780 kubelet[4330]: I0527 03:52:08.373752 4330 state_mem.go:35] "Initializing new in-memory state store" May 27 03:52:08.373851 kubelet[4330]: I0527 03:52:08.373843 4330 state_mem.go:75] "Updated machine memory state" May 27 03:52:08.376864 kubelet[4330]: I0527 03:52:08.376851 4330 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 03:52:08.377028 kubelet[4330]: I0527 03:52:08.377011 4330 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 03:52:08.377071 kubelet[4330]: I0527 03:52:08.377023 4330 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 03:52:08.377204 kubelet[4330]: I0527 03:52:08.377189 4330 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 03:52:08.377634 kubelet[4330]: E0527 03:52:08.377606 4330 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 03:52:08.451226 kubelet[4330]: I0527 03:52:08.451195 4330 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:08.451294 kubelet[4330]: I0527 03:52:08.451208 4330 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:08.451338 kubelet[4330]: I0527 03:52:08.451323 4330 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.454096 kubelet[4330]: W0527 03:52:08.454077 4330 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:52:08.454221 kubelet[4330]: W0527 03:52:08.454203 4330 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:52:08.454221 kubelet[4330]: W0527 03:52:08.454215 4330 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:52:08.479949 kubelet[4330]: I0527 03:52:08.479931 4330 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-636136d453" May 27 03:52:08.484415 kubelet[4330]: I0527 03:52:08.484394 4330 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344.0.0-a-636136d453" May 27 03:52:08.484454 kubelet[4330]: I0527 03:52:08.484450 4330 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-636136d453" May 27 03:52:08.542959 kubelet[4330]: I0527 03:52:08.542932 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.543008 kubelet[4330]: I0527 03:52:08.542963 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f3b2db9100cde75d750d7e671d9fc72-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-636136d453\" (UID: \"3f3b2db9100cde75d750d7e671d9fc72\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:08.543008 kubelet[4330]: I0527 03:52:08.542983 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: \"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:08.543008 kubelet[4330]: I0527 03:52:08.542998 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: \"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:08.543121 kubelet[4330]: I0527 03:52:08.543025 4330 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.543121 kubelet[4330]: I0527 03:52:08.543046 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.543121 kubelet[4330]: I0527 03:52:08.543107 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.543236 kubelet[4330]: I0527 03:52:08.543152 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1bccf1d2d7a6f70cf1e05309180c9478-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-636136d453\" (UID: \"1bccf1d2d7a6f70cf1e05309180c9478\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" May 27 03:52:08.543236 kubelet[4330]: I0527 03:52:08.543181 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/64eeadc5624e0fc6d66db7d77f616932-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-636136d453\" (UID: \"64eeadc5624e0fc6d66db7d77f616932\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:09.339941 kubelet[4330]: I0527 03:52:09.339892 4330 apiserver.go:52] "Watching apiserver" May 27 03:52:09.342112 kubelet[4330]: I0527 03:52:09.342093 4330 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 03:52:09.357386 kubelet[4330]: I0527 03:52:09.357372 4330 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:09.357467 kubelet[4330]: I0527 03:52:09.357451 4330 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:09.360376 kubelet[4330]: W0527 03:52:09.360352 4330 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:52:09.360419 kubelet[4330]: E0527 03:52:09.360408 4330 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-636136d453\" already exists" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" May 27 03:52:09.360470 kubelet[4330]: W0527 03:52:09.360451 4330 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:52:09.360519 kubelet[4330]: E0527 03:52:09.360499 4330 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-a-636136d453\" already exists" 
pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" May 27 03:52:09.382844 kubelet[4330]: I0527 03:52:09.382784 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-636136d453" podStartSLOduration=1.382769581 podStartE2EDuration="1.382769581s" podCreationTimestamp="2025-05-27 03:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:09.382396141 +0000 UTC m=+1.115234361" watchObservedRunningTime="2025-05-27 03:52:09.382769581 +0000 UTC m=+1.115607761" May 27 03:52:09.393252 kubelet[4330]: I0527 03:52:09.393212 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344.0.0-a-636136d453" podStartSLOduration=1.393199101 podStartE2EDuration="1.393199101s" podCreationTimestamp="2025-05-27 03:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:09.388416821 +0000 UTC m=+1.121255041" watchObservedRunningTime="2025-05-27 03:52:09.393199101 +0000 UTC m=+1.126037321" May 27 03:52:09.399419 kubelet[4330]: I0527 03:52:09.399383 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344.0.0-a-636136d453" podStartSLOduration=1.399371781 podStartE2EDuration="1.399371781s" podCreationTimestamp="2025-05-27 03:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:09.393291501 +0000 UTC m=+1.126129761" watchObservedRunningTime="2025-05-27 03:52:09.399371781 +0000 UTC m=+1.132210001" May 27 03:52:13.709448 kubelet[4330]: I0527 03:52:13.709396 4330 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 03:52:13.709881 containerd[2800]: time="2025-05-27T03:52:13.709726657Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 03:52:13.710050 kubelet[4330]: I0527 03:52:13.709878 4330 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 03:52:14.880488 systemd[1]: Created slice kubepods-besteffort-podce0aa6f9_8f5d_42b7_86dc_a7cd89a8f3f8.slice - libcontainer container kubepods-besteffort-podce0aa6f9_8f5d_42b7_86dc_a7cd89a8f3f8.slice. 
May 27 03:52:14.977595 kubelet[4330]: I0527 03:52:14.977555 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8-lib-modules\") pod \"kube-proxy-4s89x\" (UID: \"ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8\") " pod="kube-system/kube-proxy-4s89x" May 27 03:52:14.977888 kubelet[4330]: I0527 03:52:14.977602 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8-kube-proxy\") pod \"kube-proxy-4s89x\" (UID: \"ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8\") " pod="kube-system/kube-proxy-4s89x" May 27 03:52:14.977888 kubelet[4330]: I0527 03:52:14.977647 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8-xtables-lock\") pod \"kube-proxy-4s89x\" (UID: \"ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8\") " pod="kube-system/kube-proxy-4s89x" May 27 03:52:14.977888 kubelet[4330]: I0527 03:52:14.977695 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fp6z\" (UniqueName: \"kubernetes.io/projected/ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8-kube-api-access-2fp6z\") pod \"kube-proxy-4s89x\" (UID: \"ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8\") " pod="kube-system/kube-proxy-4s89x" May 27 03:52:14.991877 systemd[1]: Created slice kubepods-besteffort-pod2f56cf2f_b280_4891_a007_525d5fd3f8b2.slice - libcontainer container kubepods-besteffort-pod2f56cf2f_b280_4891_a007_525d5fd3f8b2.slice. May 27 03:52:15.078221 kubelet[4330]: I0527 03:52:15.078185 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct5sh\" (UniqueName: \"kubernetes.io/projected/2f56cf2f-b280-4891-a007-525d5fd3f8b2-kube-api-access-ct5sh\") pod \"tigera-operator-844669ff44-xf9th\" (UID: \"2f56cf2f-b280-4891-a007-525d5fd3f8b2\") " pod="tigera-operator/tigera-operator-844669ff44-xf9th" May 27 03:52:15.078336 kubelet[4330]: I0527 03:52:15.078251 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2f56cf2f-b280-4891-a007-525d5fd3f8b2-var-lib-calico\") pod \"tigera-operator-844669ff44-xf9th\" (UID: \"2f56cf2f-b280-4891-a007-525d5fd3f8b2\") " pod="tigera-operator/tigera-operator-844669ff44-xf9th" May 27 03:52:15.205955 containerd[2800]: time="2025-05-27T03:52:15.205845373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4s89x,Uid:ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8,Namespace:kube-system,Attempt:0,}" May 27 03:52:15.213531 containerd[2800]: time="2025-05-27T03:52:15.213506550Z" level=info msg="connecting to shim 85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f" address="unix:///run/containerd/s/bc1bc7cec9ff9cf294ba7f5c37ae32bddf08afc4a046e74886e44c07fbcce8ff" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:15.247793 systemd[1]: Started cri-containerd-85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f.scope - libcontainer container 85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f. 
May 27 03:52:15.265051 containerd[2800]: time="2025-05-27T03:52:15.265015367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4s89x,Uid:ce0aa6f9-8f5d-42b7-86dc-a7cd89a8f3f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f\"" May 27 03:52:15.267161 containerd[2800]: time="2025-05-27T03:52:15.267136463Z" level=info msg="CreateContainer within sandbox \"85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 03:52:15.272552 containerd[2800]: time="2025-05-27T03:52:15.272519462Z" level=info msg="Container 9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:15.276274 containerd[2800]: time="2025-05-27T03:52:15.276242090Z" level=info msg="CreateContainer within sandbox \"85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb\"" May 27 03:52:15.276619 containerd[2800]: time="2025-05-27T03:52:15.276601932Z" level=info msg="StartContainer for \"9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb\"" May 27 03:52:15.277874 containerd[2800]: time="2025-05-27T03:52:15.277850542Z" level=info msg="connecting to shim 9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb" address="unix:///run/containerd/s/bc1bc7cec9ff9cf294ba7f5c37ae32bddf08afc4a046e74886e44c07fbcce8ff" protocol=ttrpc version=3 May 27 03:52:15.288135 systemd[1]: Started cri-containerd-9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb.scope - libcontainer container 9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb. May 27 03:52:15.294610 containerd[2800]: time="2025-05-27T03:52:15.294569944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-xf9th,Uid:2f56cf2f-b280-4891-a007-525d5fd3f8b2,Namespace:tigera-operator,Attempt:0,}" May 27 03:52:15.302988 containerd[2800]: time="2025-05-27T03:52:15.302961806Z" level=info msg="connecting to shim 0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450" address="unix:///run/containerd/s/794f9577ac73e7579061127f2a0c08d80247a8025c587c5ddaa8d134d2dbdb7f" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:15.314749 containerd[2800]: time="2025-05-27T03:52:15.314721172Z" level=info msg="StartContainer for \"9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb\" returns successfully" May 27 03:52:15.339794 systemd[1]: Started cri-containerd-0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450.scope - libcontainer container 0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450. 
May 27 03:52:15.365575 containerd[2800]: time="2025-05-27T03:52:15.365545985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-xf9th,Uid:2f56cf2f-b280-4891-a007-525d5fd3f8b2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450\"" May 27 03:52:15.366636 containerd[2800]: time="2025-05-27T03:52:15.366596432Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 03:52:15.372952 kubelet[4330]: I0527 03:52:15.372905 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4s89x" podStartSLOduration=1.372889199 podStartE2EDuration="1.372889199s" podCreationTimestamp="2025-05-27 03:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:15.372748198 +0000 UTC m=+7.105586418" watchObservedRunningTime="2025-05-27 03:52:15.372889199 +0000 UTC m=+7.105727419" May 27 03:52:16.314699 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1374810595.mount: Deactivated successfully. May 27 03:52:19.785107 containerd[2800]: time="2025-05-27T03:52:19.785041571Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:19.785465 containerd[2800]: time="2025-05-27T03:52:19.785065812Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 03:52:19.786015 containerd[2800]: time="2025-05-27T03:52:19.785674615Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:19.787448 containerd[2800]: time="2025-05-27T03:52:19.787421585Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:19.788155 containerd[2800]: time="2025-05-27T03:52:19.788119469Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 4.421494956s" May 27 03:52:19.788155 containerd[2800]: time="2025-05-27T03:52:19.788156349Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 03:52:19.789739 containerd[2800]: time="2025-05-27T03:52:19.789714718Z" level=info msg="CreateContainer within sandbox \"0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 03:52:19.793354 containerd[2800]: time="2025-05-27T03:52:19.793326098Z" level=info msg="Container 438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:19.795938 containerd[2800]: time="2025-05-27T03:52:19.795915273Z" level=info msg="CreateContainer within sandbox \"0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id 
\"438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56\"" May 27 03:52:19.796086 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2711096692.mount: Deactivated successfully. May 27 03:52:19.796340 containerd[2800]: time="2025-05-27T03:52:19.796226315Z" level=info msg="StartContainer for \"438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56\"" May 27 03:52:19.796960 containerd[2800]: time="2025-05-27T03:52:19.796937519Z" level=info msg="connecting to shim 438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56" address="unix:///run/containerd/s/794f9577ac73e7579061127f2a0c08d80247a8025c587c5ddaa8d134d2dbdb7f" protocol=ttrpc version=3 May 27 03:52:19.824776 systemd[1]: Started cri-containerd-438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56.scope - libcontainer container 438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56. May 27 03:52:19.844345 containerd[2800]: time="2025-05-27T03:52:19.844316227Z" level=info msg="StartContainer for \"438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56\" returns successfully" May 27 03:52:20.380714 kubelet[4330]: I0527 03:52:20.380658 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-xf9th" podStartSLOduration=1.9582733289999998 podStartE2EDuration="6.380644891s" podCreationTimestamp="2025-05-27 03:52:14 +0000 UTC" firstStartedPulling="2025-05-27 03:52:15.36627483 +0000 UTC m=+7.099113050" lastFinishedPulling="2025-05-27 03:52:19.788646392 +0000 UTC m=+11.521484612" observedRunningTime="2025-05-27 03:52:20.38049253 +0000 UTC m=+12.113330790" watchObservedRunningTime="2025-05-27 03:52:20.380644891 +0000 UTC m=+12.113483111" May 27 03:52:20.813743 update_engine[2788]: I20250527 03:52:20.813691 2788 update_attempter.cc:509] Updating boot flags... May 27 03:52:24.501761 sudo[3070]: pam_unix(sudo:session): session closed for user root May 27 03:52:24.565526 sshd[3069]: Connection closed by 139.178.89.65 port 40018 May 27 03:52:24.565898 sshd-session[3067]: pam_unix(sshd:session): session closed for user core May 27 03:52:24.568927 systemd[1]: sshd@6-147.28.163.138:22-139.178.89.65:40018.service: Deactivated successfully. May 27 03:52:24.571203 systemd[1]: session-9.scope: Deactivated successfully. May 27 03:52:24.571412 systemd[1]: session-9.scope: Consumed 8.129s CPU time, 247.2M memory peak. May 27 03:52:24.572449 systemd-logind[2785]: Session 9 logged out. Waiting for processes to exit. May 27 03:52:24.573272 systemd-logind[2785]: Removed session 9. May 27 03:52:27.917845 systemd[1]: Created slice kubepods-besteffort-podeafb2fad_4286_4cd0_8b79_bedc8307a072.slice - libcontainer container kubepods-besteffort-podeafb2fad_4286_4cd0_8b79_bedc8307a072.slice. 
May 27 03:52:27.960403 kubelet[4330]: I0527 03:52:27.960371 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/eafb2fad-4286-4cd0-8b79-bedc8307a072-typha-certs\") pod \"calico-typha-55f44dcfcd-4ldbr\" (UID: \"eafb2fad-4286-4cd0-8b79-bedc8307a072\") " pod="calico-system/calico-typha-55f44dcfcd-4ldbr" May 27 03:52:27.960403 kubelet[4330]: I0527 03:52:27.960405 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrgh\" (UniqueName: \"kubernetes.io/projected/eafb2fad-4286-4cd0-8b79-bedc8307a072-kube-api-access-7qrgh\") pod \"calico-typha-55f44dcfcd-4ldbr\" (UID: \"eafb2fad-4286-4cd0-8b79-bedc8307a072\") " pod="calico-system/calico-typha-55f44dcfcd-4ldbr" May 27 03:52:27.960744 kubelet[4330]: I0527 03:52:27.960426 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eafb2fad-4286-4cd0-8b79-bedc8307a072-tigera-ca-bundle\") pod \"calico-typha-55f44dcfcd-4ldbr\" (UID: \"eafb2fad-4286-4cd0-8b79-bedc8307a072\") " pod="calico-system/calico-typha-55f44dcfcd-4ldbr" May 27 03:52:28.163889 systemd[1]: Created slice kubepods-besteffort-pod1551c92f_b16c_4c84_97c1_e8305f3d8b8f.slice - libcontainer container kubepods-besteffort-pod1551c92f_b16c_4c84_97c1_e8305f3d8b8f.slice. May 27 03:52:28.221983 containerd[2800]: time="2025-05-27T03:52:28.221890167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55f44dcfcd-4ldbr,Uid:eafb2fad-4286-4cd0-8b79-bedc8307a072,Namespace:calico-system,Attempt:0,}" May 27 03:52:28.230603 containerd[2800]: time="2025-05-27T03:52:28.230570074Z" level=info msg="connecting to shim fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029" address="unix:///run/containerd/s/c9e73a2e9de4dcfdec5585f52a938d70c2a4bb02464abd5ce2456f0903599d1f" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:28.260915 systemd[1]: Started cri-containerd-fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029.scope - libcontainer container fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029. 
May 27 03:52:28.261128 kubelet[4330]: I0527 03:52:28.261103 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-var-lib-calico\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261170 kubelet[4330]: I0527 03:52:28.261138 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t789\" (UniqueName: \"kubernetes.io/projected/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-kube-api-access-7t789\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261170 kubelet[4330]: I0527 03:52:28.261157 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-policysync\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261211 kubelet[4330]: I0527 03:52:28.261174 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-lib-modules\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261211 kubelet[4330]: I0527 03:52:28.261192 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-node-certs\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261211 kubelet[4330]: I0527 03:52:28.261208 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-var-run-calico\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261273 kubelet[4330]: I0527 03:52:28.261225 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-xtables-lock\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261273 kubelet[4330]: I0527 03:52:28.261242 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-tigera-ca-bundle\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261273 kubelet[4330]: I0527 03:52:28.261262 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-cni-bin-dir\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261328 kubelet[4330]: I0527 03:52:28.261276 4330 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-cni-log-dir\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261328 kubelet[4330]: I0527 03:52:28.261314 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-cni-net-dir\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.261367 kubelet[4330]: I0527 03:52:28.261334 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1551c92f-b16c-4c84-97c1-e8305f3d8b8f-flexvol-driver-host\") pod \"calico-node-8lnmb\" (UID: \"1551c92f-b16c-4c84-97c1-e8305f3d8b8f\") " pod="calico-system/calico-node-8lnmb" May 27 03:52:28.286585 containerd[2800]: time="2025-05-27T03:52:28.286548532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55f44dcfcd-4ldbr,Uid:eafb2fad-4286-4cd0-8b79-bedc8307a072,Namespace:calico-system,Attempt:0,} returns sandbox id \"fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029\"" May 27 03:52:28.287756 containerd[2800]: time="2025-05-27T03:52:28.287733655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 03:52:28.362735 kubelet[4330]: E0527 03:52:28.362707 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.362735 kubelet[4330]: W0527 03:52:28.362728 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.362852 kubelet[4330]: E0527 03:52:28.362748 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.364310 kubelet[4330]: E0527 03:52:28.364288 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.364310 kubelet[4330]: W0527 03:52:28.364305 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.364359 kubelet[4330]: E0527 03:52:28.364321 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.370144 kubelet[4330]: E0527 03:52:28.370126 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.370144 kubelet[4330]: W0527 03:52:28.370140 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.370190 kubelet[4330]: E0527 03:52:28.370154 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.406994 kubelet[4330]: E0527 03:52:28.406953 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qm7vf" podUID="ff1e6702-11d8-474c-81da-442634bba8ed" May 27 03:52:28.442277 kubelet[4330]: E0527 03:52:28.442253 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.442277 kubelet[4330]: W0527 03:52:28.442270 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.442360 kubelet[4330]: E0527 03:52:28.442288 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.442497 kubelet[4330]: E0527 03:52:28.442481 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.442543 kubelet[4330]: W0527 03:52:28.442491 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.442543 kubelet[4330]: E0527 03:52:28.442529 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.442721 kubelet[4330]: E0527 03:52:28.442709 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.442721 kubelet[4330]: W0527 03:52:28.442717 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.442763 kubelet[4330]: E0527 03:52:28.442725 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.442859 kubelet[4330]: E0527 03:52:28.442849 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.442859 kubelet[4330]: W0527 03:52:28.442856 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.442902 kubelet[4330]: E0527 03:52:28.442864 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.442999 kubelet[4330]: E0527 03:52:28.442988 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.442999 kubelet[4330]: W0527 03:52:28.442996 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443037 kubelet[4330]: E0527 03:52:28.443003 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.443156 kubelet[4330]: E0527 03:52:28.443146 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.443156 kubelet[4330]: W0527 03:52:28.443153 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443199 kubelet[4330]: E0527 03:52:28.443160 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.443374 kubelet[4330]: E0527 03:52:28.443363 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.443374 kubelet[4330]: W0527 03:52:28.443371 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443412 kubelet[4330]: E0527 03:52:28.443379 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.443549 kubelet[4330]: E0527 03:52:28.443541 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.443570 kubelet[4330]: W0527 03:52:28.443549 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443570 kubelet[4330]: E0527 03:52:28.443557 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.443718 kubelet[4330]: E0527 03:52:28.443709 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.443744 kubelet[4330]: W0527 03:52:28.443717 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443744 kubelet[4330]: E0527 03:52:28.443725 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.443888 kubelet[4330]: E0527 03:52:28.443877 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.443914 kubelet[4330]: W0527 03:52:28.443888 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.443914 kubelet[4330]: E0527 03:52:28.443896 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.444065 kubelet[4330]: E0527 03:52:28.444057 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444084 kubelet[4330]: W0527 03:52:28.444065 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444084 kubelet[4330]: E0527 03:52:28.444072 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.444247 kubelet[4330]: E0527 03:52:28.444237 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444247 kubelet[4330]: W0527 03:52:28.444244 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444290 kubelet[4330]: E0527 03:52:28.444252 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.444394 kubelet[4330]: E0527 03:52:28.444383 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444394 kubelet[4330]: W0527 03:52:28.444391 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444434 kubelet[4330]: E0527 03:52:28.444399 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.444571 kubelet[4330]: E0527 03:52:28.444561 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444571 kubelet[4330]: W0527 03:52:28.444569 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444612 kubelet[4330]: E0527 03:52:28.444576 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.444745 kubelet[4330]: E0527 03:52:28.444735 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444745 kubelet[4330]: W0527 03:52:28.444742 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444786 kubelet[4330]: E0527 03:52:28.444749 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.444902 kubelet[4330]: E0527 03:52:28.444893 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.444925 kubelet[4330]: W0527 03:52:28.444901 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.444925 kubelet[4330]: E0527 03:52:28.444908 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.445039 kubelet[4330]: E0527 03:52:28.445031 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.445059 kubelet[4330]: W0527 03:52:28.445039 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.445059 kubelet[4330]: E0527 03:52:28.445049 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.445171 kubelet[4330]: E0527 03:52:28.445164 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.445194 kubelet[4330]: W0527 03:52:28.445171 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.445194 kubelet[4330]: E0527 03:52:28.445178 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.445297 kubelet[4330]: E0527 03:52:28.445290 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.445318 kubelet[4330]: W0527 03:52:28.445297 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.445318 kubelet[4330]: E0527 03:52:28.445304 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.445487 kubelet[4330]: E0527 03:52:28.445479 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.445510 kubelet[4330]: W0527 03:52:28.445487 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.445510 kubelet[4330]: E0527 03:52:28.445494 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.462777 kubelet[4330]: E0527 03:52:28.462755 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.462777 kubelet[4330]: W0527 03:52:28.462770 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.462848 kubelet[4330]: E0527 03:52:28.462785 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.462848 kubelet[4330]: I0527 03:52:28.462806 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ff1e6702-11d8-474c-81da-442634bba8ed-registration-dir\") pod \"csi-node-driver-qm7vf\" (UID: \"ff1e6702-11d8-474c-81da-442634bba8ed\") " pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:28.462968 kubelet[4330]: E0527 03:52:28.462955 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.462968 kubelet[4330]: W0527 03:52:28.462965 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.463007 kubelet[4330]: E0527 03:52:28.462977 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.463007 kubelet[4330]: I0527 03:52:28.462992 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ff1e6702-11d8-474c-81da-442634bba8ed-socket-dir\") pod \"csi-node-driver-qm7vf\" (UID: \"ff1e6702-11d8-474c-81da-442634bba8ed\") " pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:28.463177 kubelet[4330]: E0527 03:52:28.463168 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.463198 kubelet[4330]: W0527 03:52:28.463178 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.463198 kubelet[4330]: E0527 03:52:28.463190 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.463236 kubelet[4330]: I0527 03:52:28.463203 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqh77\" (UniqueName: \"kubernetes.io/projected/ff1e6702-11d8-474c-81da-442634bba8ed-kube-api-access-tqh77\") pod \"csi-node-driver-qm7vf\" (UID: \"ff1e6702-11d8-474c-81da-442634bba8ed\") " pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:28.463411 kubelet[4330]: E0527 03:52:28.463401 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.463430 kubelet[4330]: W0527 03:52:28.463410 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.463430 kubelet[4330]: E0527 03:52:28.463421 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.463463 kubelet[4330]: I0527 03:52:28.463434 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff1e6702-11d8-474c-81da-442634bba8ed-kubelet-dir\") pod \"csi-node-driver-qm7vf\" (UID: \"ff1e6702-11d8-474c-81da-442634bba8ed\") " pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:28.463653 kubelet[4330]: E0527 03:52:28.463644 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.463688 kubelet[4330]: W0527 03:52:28.463653 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.463688 kubelet[4330]: E0527 03:52:28.463663 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.463688 kubelet[4330]: I0527 03:52:28.463681 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ff1e6702-11d8-474c-81da-442634bba8ed-varrun\") pod \"csi-node-driver-qm7vf\" (UID: \"ff1e6702-11d8-474c-81da-442634bba8ed\") " pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:28.463839 kubelet[4330]: E0527 03:52:28.463829 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.463862 kubelet[4330]: W0527 03:52:28.463838 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.463862 kubelet[4330]: E0527 03:52:28.463849 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.463979 kubelet[4330]: E0527 03:52:28.463971 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464000 kubelet[4330]: W0527 03:52:28.463979 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464000 kubelet[4330]: E0527 03:52:28.463988 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.464127 kubelet[4330]: E0527 03:52:28.464119 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464148 kubelet[4330]: W0527 03:52:28.464127 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464148 kubelet[4330]: E0527 03:52:28.464136 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.464307 kubelet[4330]: E0527 03:52:28.464299 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464326 kubelet[4330]: W0527 03:52:28.464306 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464326 kubelet[4330]: E0527 03:52:28.464320 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.464497 kubelet[4330]: E0527 03:52:28.464490 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464563 kubelet[4330]: W0527 03:52:28.464497 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464563 kubelet[4330]: E0527 03:52:28.464512 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.464668 kubelet[4330]: E0527 03:52:28.464660 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464689 kubelet[4330]: W0527 03:52:28.464672 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464713 kubelet[4330]: E0527 03:52:28.464687 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.464803 kubelet[4330]: E0527 03:52:28.464796 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.464827 kubelet[4330]: W0527 03:52:28.464803 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.464827 kubelet[4330]: E0527 03:52:28.464815 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.465025 kubelet[4330]: E0527 03:52:28.465018 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.465046 kubelet[4330]: W0527 03:52:28.465025 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.465046 kubelet[4330]: E0527 03:52:28.465032 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.465203 kubelet[4330]: E0527 03:52:28.465196 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.465224 kubelet[4330]: W0527 03:52:28.465202 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.465224 kubelet[4330]: E0527 03:52:28.465212 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.465372 kubelet[4330]: E0527 03:52:28.465364 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.465393 kubelet[4330]: W0527 03:52:28.465372 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.465393 kubelet[4330]: E0527 03:52:28.465379 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.465824 containerd[2800]: time="2025-05-27T03:52:28.465796700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8lnmb,Uid:1551c92f-b16c-4c84-97c1-e8305f3d8b8f,Namespace:calico-system,Attempt:0,}" May 27 03:52:28.473919 containerd[2800]: time="2025-05-27T03:52:28.473864605Z" level=info msg="connecting to shim 0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0" address="unix:///run/containerd/s/a0618409e2be4f44a3fa09d8e8806d7701e7b02aee1c7f9ba0943975c6be5144" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:28.497848 systemd[1]: Started cri-containerd-0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0.scope - libcontainer container 0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0. May 27 03:52:28.515551 containerd[2800]: time="2025-05-27T03:52:28.515521017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8lnmb,Uid:1551c92f-b16c-4c84-97c1-e8305f3d8b8f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\"" May 27 03:52:28.564954 kubelet[4330]: E0527 03:52:28.564926 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.564954 kubelet[4330]: W0527 03:52:28.564948 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.565041 kubelet[4330]: E0527 03:52:28.564969 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.565217 kubelet[4330]: E0527 03:52:28.565206 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.565217 kubelet[4330]: W0527 03:52:28.565215 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.565256 kubelet[4330]: E0527 03:52:28.565226 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.565464 kubelet[4330]: E0527 03:52:28.565453 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.565464 kubelet[4330]: W0527 03:52:28.565461 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.565504 kubelet[4330]: E0527 03:52:28.565472 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.565677 kubelet[4330]: E0527 03:52:28.565662 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.565677 kubelet[4330]: W0527 03:52:28.565675 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.565719 kubelet[4330]: E0527 03:52:28.565686 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.565834 kubelet[4330]: E0527 03:52:28.565822 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.565834 kubelet[4330]: W0527 03:52:28.565831 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.565875 kubelet[4330]: E0527 03:52:28.565842 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.566017 kubelet[4330]: E0527 03:52:28.566008 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566041 kubelet[4330]: W0527 03:52:28.566016 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566041 kubelet[4330]: E0527 03:52:28.566028 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.566167 kubelet[4330]: E0527 03:52:28.566159 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566190 kubelet[4330]: W0527 03:52:28.566166 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566209 kubelet[4330]: E0527 03:52:28.566191 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.566377 kubelet[4330]: E0527 03:52:28.566368 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566396 kubelet[4330]: W0527 03:52:28.566377 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566415 kubelet[4330]: E0527 03:52:28.566402 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.566535 kubelet[4330]: E0527 03:52:28.566528 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566557 kubelet[4330]: W0527 03:52:28.566535 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566603 kubelet[4330]: E0527 03:52:28.566573 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.566743 kubelet[4330]: E0527 03:52:28.566735 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566764 kubelet[4330]: W0527 03:52:28.566742 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566764 kubelet[4330]: E0527 03:52:28.566754 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.566885 kubelet[4330]: E0527 03:52:28.566875 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.566885 kubelet[4330]: W0527 03:52:28.566884 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.566928 kubelet[4330]: E0527 03:52:28.566895 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.567060 kubelet[4330]: E0527 03:52:28.567050 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567060 kubelet[4330]: W0527 03:52:28.567057 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.567099 kubelet[4330]: E0527 03:52:28.567078 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.567228 kubelet[4330]: E0527 03:52:28.567219 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567228 kubelet[4330]: W0527 03:52:28.567226 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.567269 kubelet[4330]: E0527 03:52:28.567241 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.567398 kubelet[4330]: E0527 03:52:28.567388 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567398 kubelet[4330]: W0527 03:52:28.567396 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.567435 kubelet[4330]: E0527 03:52:28.567411 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.567600 kubelet[4330]: E0527 03:52:28.567593 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567623 kubelet[4330]: W0527 03:52:28.567600 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.567623 kubelet[4330]: E0527 03:52:28.567611 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.567833 kubelet[4330]: E0527 03:52:28.567822 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567833 kubelet[4330]: W0527 03:52:28.567829 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.567871 kubelet[4330]: E0527 03:52:28.567847 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.567957 kubelet[4330]: E0527 03:52:28.567949 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.567981 kubelet[4330]: W0527 03:52:28.567957 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568002 kubelet[4330]: E0527 03:52:28.567979 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.568198 kubelet[4330]: E0527 03:52:28.568189 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.568217 kubelet[4330]: W0527 03:52:28.568198 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568217 kubelet[4330]: E0527 03:52:28.568209 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.568430 kubelet[4330]: E0527 03:52:28.568419 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.568430 kubelet[4330]: W0527 03:52:28.568427 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568476 kubelet[4330]: E0527 03:52:28.568438 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.568621 kubelet[4330]: E0527 03:52:28.568611 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.568621 kubelet[4330]: W0527 03:52:28.568618 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568658 kubelet[4330]: E0527 03:52:28.568641 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.568773 kubelet[4330]: E0527 03:52:28.568761 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.568773 kubelet[4330]: W0527 03:52:28.568769 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568820 kubelet[4330]: E0527 03:52:28.568790 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.568918 kubelet[4330]: E0527 03:52:28.568907 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.568918 kubelet[4330]: W0527 03:52:28.568915 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.568958 kubelet[4330]: E0527 03:52:28.568926 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:28.569067 kubelet[4330]: E0527 03:52:28.569060 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.569091 kubelet[4330]: W0527 03:52:28.569067 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.569112 kubelet[4330]: E0527 03:52:28.569079 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.569309 kubelet[4330]: E0527 03:52:28.569299 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.569328 kubelet[4330]: W0527 03:52:28.569308 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.569328 kubelet[4330]: E0527 03:52:28.569321 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.569549 kubelet[4330]: E0527 03:52:28.569537 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.569549 kubelet[4330]: W0527 03:52:28.569546 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.569590 kubelet[4330]: E0527 03:52:28.569556 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:28.577345 kubelet[4330]: E0527 03:52:28.577328 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:28.577372 kubelet[4330]: W0527 03:52:28.577343 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:28.577372 kubelet[4330]: E0527 03:52:28.577357 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.204341 containerd[2800]: time="2025-05-27T03:52:29.204291160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.204341 containerd[2800]: time="2025-05-27T03:52:29.204341680Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269" May 27 03:52:29.205016 containerd[2800]: time="2025-05-27T03:52:29.204993842Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.206411 containerd[2800]: time="2025-05-27T03:52:29.206385606Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.206991 containerd[2800]: time="2025-05-27T03:52:29.206967648Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 919.205193ms" May 27 03:52:29.207015 containerd[2800]: time="2025-05-27T03:52:29.206997648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\"" May 27 03:52:29.207684 containerd[2800]: time="2025-05-27T03:52:29.207658810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 03:52:29.212501 containerd[2800]: time="2025-05-27T03:52:29.212470824Z" level=info msg="CreateContainer within sandbox \"fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 03:52:29.216274 containerd[2800]: time="2025-05-27T03:52:29.216236995Z" level=info msg="Container 65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:29.219507 containerd[2800]: time="2025-05-27T03:52:29.219473605Z" level=info msg="CreateContainer within sandbox \"fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b\"" May 27 03:52:29.219865 containerd[2800]: time="2025-05-27T03:52:29.219841766Z" level=info msg="StartContainer for \"65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b\"" May 27 03:52:29.220827 containerd[2800]: time="2025-05-27T03:52:29.220800049Z" level=info msg="connecting to shim 65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b" address="unix:///run/containerd/s/c9e73a2e9de4dcfdec5585f52a938d70c2a4bb02464abd5ce2456f0903599d1f" protocol=ttrpc version=3 May 27 03:52:29.253846 systemd[1]: Started cri-containerd-65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b.scope - libcontainer container 65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b. 
May 27 03:52:29.282265 containerd[2800]: time="2025-05-27T03:52:29.282237512Z" level=info msg="StartContainer for \"65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b\" returns successfully" May 27 03:52:29.395023 kubelet[4330]: I0527 03:52:29.394959 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-55f44dcfcd-4ldbr" podStartSLOduration=1.4748949709999999 podStartE2EDuration="2.394943566s" podCreationTimestamp="2025-05-27 03:52:27 +0000 UTC" firstStartedPulling="2025-05-27 03:52:28.287488615 +0000 UTC m=+20.020326835" lastFinishedPulling="2025-05-27 03:52:29.20753721 +0000 UTC m=+20.940375430" observedRunningTime="2025-05-27 03:52:29.394901646 +0000 UTC m=+21.127739866" watchObservedRunningTime="2025-05-27 03:52:29.394943566 +0000 UTC m=+21.127781786" May 27 03:52:29.453247 kubelet[4330]: E0527 03:52:29.453213 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.453247 kubelet[4330]: W0527 03:52:29.453236 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.453247 kubelet[4330]: E0527 03:52:29.453258 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.453549 kubelet[4330]: E0527 03:52:29.453447 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.453549 kubelet[4330]: W0527 03:52:29.453454 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.453549 kubelet[4330]: E0527 03:52:29.453497 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.453686 kubelet[4330]: E0527 03:52:29.453674 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.453686 kubelet[4330]: W0527 03:52:29.453684 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.453731 kubelet[4330]: E0527 03:52:29.453692 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.453877 kubelet[4330]: E0527 03:52:29.453868 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.453896 kubelet[4330]: W0527 03:52:29.453876 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.453896 kubelet[4330]: E0527 03:52:29.453883 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.454053 kubelet[4330]: E0527 03:52:29.454045 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.454075 kubelet[4330]: W0527 03:52:29.454053 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.454075 kubelet[4330]: E0527 03:52:29.454060 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.454220 kubelet[4330]: E0527 03:52:29.454212 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.454242 kubelet[4330]: W0527 03:52:29.454219 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.454242 kubelet[4330]: E0527 03:52:29.454226 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.454421 kubelet[4330]: E0527 03:52:29.454378 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.454421 kubelet[4330]: W0527 03:52:29.454387 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.454421 kubelet[4330]: E0527 03:52:29.454394 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.454577 kubelet[4330]: E0527 03:52:29.454544 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.454577 kubelet[4330]: W0527 03:52:29.454556 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.454577 kubelet[4330]: E0527 03:52:29.454563 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.454861 kubelet[4330]: E0527 03:52:29.454848 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.454891 kubelet[4330]: W0527 03:52:29.454861 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.454891 kubelet[4330]: E0527 03:52:29.454872 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.455028 kubelet[4330]: E0527 03:52:29.455016 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455028 kubelet[4330]: W0527 03:52:29.455025 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455076 kubelet[4330]: E0527 03:52:29.455033 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.455177 kubelet[4330]: E0527 03:52:29.455160 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455177 kubelet[4330]: W0527 03:52:29.455168 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455177 kubelet[4330]: E0527 03:52:29.455176 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.455301 kubelet[4330]: E0527 03:52:29.455292 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455301 kubelet[4330]: W0527 03:52:29.455299 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455345 kubelet[4330]: E0527 03:52:29.455307 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.455437 kubelet[4330]: E0527 03:52:29.455429 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455437 kubelet[4330]: W0527 03:52:29.455436 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455497 kubelet[4330]: E0527 03:52:29.455442 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.455567 kubelet[4330]: E0527 03:52:29.455559 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455567 kubelet[4330]: W0527 03:52:29.455566 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455613 kubelet[4330]: E0527 03:52:29.455573 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.455725 kubelet[4330]: E0527 03:52:29.455716 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.455725 kubelet[4330]: W0527 03:52:29.455724 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.455776 kubelet[4330]: E0527 03:52:29.455733 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.472075 kubelet[4330]: E0527 03:52:29.472060 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.472106 kubelet[4330]: W0527 03:52:29.472075 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.472106 kubelet[4330]: E0527 03:52:29.472089 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.472272 kubelet[4330]: E0527 03:52:29.472262 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.472307 kubelet[4330]: W0527 03:52:29.472272 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.472307 kubelet[4330]: E0527 03:52:29.472284 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.472434 kubelet[4330]: E0527 03:52:29.472425 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.472456 kubelet[4330]: W0527 03:52:29.472434 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.472456 kubelet[4330]: E0527 03:52:29.472446 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.472692 kubelet[4330]: E0527 03:52:29.472683 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.472717 kubelet[4330]: W0527 03:52:29.472692 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.472717 kubelet[4330]: E0527 03:52:29.472705 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.472930 kubelet[4330]: E0527 03:52:29.472922 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.472951 kubelet[4330]: W0527 03:52:29.472930 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.472951 kubelet[4330]: E0527 03:52:29.472941 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.473121 kubelet[4330]: E0527 03:52:29.473114 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473143 kubelet[4330]: W0527 03:52:29.473121 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473143 kubelet[4330]: E0527 03:52:29.473132 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.473280 kubelet[4330]: E0527 03:52:29.473272 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473302 kubelet[4330]: W0527 03:52:29.473280 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473320 kubelet[4330]: E0527 03:52:29.473299 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.473398 kubelet[4330]: E0527 03:52:29.473390 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473419 kubelet[4330]: W0527 03:52:29.473397 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473445 kubelet[4330]: E0527 03:52:29.473420 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.473528 kubelet[4330]: E0527 03:52:29.473519 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473528 kubelet[4330]: W0527 03:52:29.473526 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473570 kubelet[4330]: E0527 03:52:29.473536 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.473724 kubelet[4330]: E0527 03:52:29.473712 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473751 kubelet[4330]: W0527 03:52:29.473723 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473751 kubelet[4330]: E0527 03:52:29.473737 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.473861 kubelet[4330]: E0527 03:52:29.473853 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.473880 kubelet[4330]: W0527 03:52:29.473860 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.473880 kubelet[4330]: E0527 03:52:29.473870 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.474081 kubelet[4330]: E0527 03:52:29.474073 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.474103 kubelet[4330]: W0527 03:52:29.474081 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.474103 kubelet[4330]: E0527 03:52:29.474092 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.474204 kubelet[4330]: E0527 03:52:29.474197 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.474224 kubelet[4330]: W0527 03:52:29.474204 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.474224 kubelet[4330]: E0527 03:52:29.474214 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.474371 kubelet[4330]: E0527 03:52:29.474363 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.474393 kubelet[4330]: W0527 03:52:29.474371 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.474393 kubelet[4330]: E0527 03:52:29.474381 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.474669 kubelet[4330]: E0527 03:52:29.474653 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.474689 kubelet[4330]: W0527 03:52:29.474672 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.474708 kubelet[4330]: E0527 03:52:29.474689 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.474852 kubelet[4330]: E0527 03:52:29.474845 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.474879 kubelet[4330]: W0527 03:52:29.474853 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.474879 kubelet[4330]: E0527 03:52:29.474864 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.475065 kubelet[4330]: E0527 03:52:29.475055 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.475086 kubelet[4330]: W0527 03:52:29.475065 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.475086 kubelet[4330]: E0527 03:52:29.475078 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:52:29.475306 kubelet[4330]: E0527 03:52:29.475296 4330 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:52:29.475328 kubelet[4330]: W0527 03:52:29.475306 4330 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:52:29.475328 kubelet[4330]: E0527 03:52:29.475315 4330 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:52:29.555590 containerd[2800]: time="2025-05-27T03:52:29.555548924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.555655 containerd[2800]: time="2025-05-27T03:52:29.555587804Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304" May 27 03:52:29.556244 containerd[2800]: time="2025-05-27T03:52:29.556224206Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.557783 containerd[2800]: time="2025-05-27T03:52:29.557758370Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:29.558398 containerd[2800]: time="2025-05-27T03:52:29.558371892Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 350.675962ms" May 27 03:52:29.558421 containerd[2800]: time="2025-05-27T03:52:29.558404612Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\"" May 27 03:52:29.559980 containerd[2800]: time="2025-05-27T03:52:29.559961137Z" level=info msg="CreateContainer within sandbox \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 03:52:29.566573 containerd[2800]: time="2025-05-27T03:52:29.566542956Z" level=info msg="Container 33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:29.570359 containerd[2800]: time="2025-05-27T03:52:29.570330568Z" level=info msg="CreateContainer within sandbox \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\"" May 27 03:52:29.570702 containerd[2800]: time="2025-05-27T03:52:29.570682329Z" level=info msg="StartContainer for \"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\"" May 27 03:52:29.571943 containerd[2800]: time="2025-05-27T03:52:29.571923932Z" level=info msg="connecting to shim 33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0" address="unix:///run/containerd/s/a0618409e2be4f44a3fa09d8e8806d7701e7b02aee1c7f9ba0943975c6be5144" protocol=ttrpc version=3 May 27 03:52:29.601806 systemd[1]: Started cri-containerd-33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0.scope - libcontainer container 33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0. 
May 27 03:52:29.628700 containerd[2800]: time="2025-05-27T03:52:29.628670221Z" level=info msg="StartContainer for \"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\" returns successfully" May 27 03:52:29.638823 systemd[1]: cri-containerd-33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0.scope: Deactivated successfully. May 27 03:52:29.640907 containerd[2800]: time="2025-05-27T03:52:29.640878057Z" level=info msg="received exit event container_id:\"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\" id:\"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\" pid:5409 exited_at:{seconds:1748317949 nanos:640593216}" May 27 03:52:29.640991 containerd[2800]: time="2025-05-27T03:52:29.640965817Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\" id:\"33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0\" pid:5409 exited_at:{seconds:1748317949 nanos:640593216}" May 27 03:52:30.352650 kubelet[4330]: E0527 03:52:30.352619 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qm7vf" podUID="ff1e6702-11d8-474c-81da-442634bba8ed" May 27 03:52:30.392024 kubelet[4330]: I0527 03:52:30.391996 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:30.392859 containerd[2800]: time="2025-05-27T03:52:30.392834059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 03:52:31.574390 containerd[2800]: time="2025-05-27T03:52:31.574356770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:31.574698 containerd[2800]: time="2025-05-27T03:52:31.574409770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 03:52:31.575005 containerd[2800]: time="2025-05-27T03:52:31.574985372Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:31.576502 containerd[2800]: time="2025-05-27T03:52:31.576485256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:31.577062 containerd[2800]: time="2025-05-27T03:52:31.577040337Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 1.184175918s" May 27 03:52:31.577086 containerd[2800]: time="2025-05-27T03:52:31.577068377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 03:52:31.578811 containerd[2800]: time="2025-05-27T03:52:31.578790942Z" level=info msg="CreateContainer within sandbox \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 03:52:31.584549 containerd[2800]: time="2025-05-27T03:52:31.583403394Z" level=info msg="Container 434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:31.589674 containerd[2800]: time="2025-05-27T03:52:31.589635370Z" level=info msg="CreateContainer within sandbox \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\"" May 27 03:52:31.590075 containerd[2800]: time="2025-05-27T03:52:31.590047091Z" level=info msg="StartContainer for \"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\"" May 27 03:52:31.591397 containerd[2800]: time="2025-05-27T03:52:31.591372614Z" level=info msg="connecting to shim 434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b" address="unix:///run/containerd/s/a0618409e2be4f44a3fa09d8e8806d7701e7b02aee1c7f9ba0943975c6be5144" protocol=ttrpc version=3 May 27 03:52:31.617781 systemd[1]: Started cri-containerd-434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b.scope - libcontainer container 434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b. May 27 03:52:31.645700 containerd[2800]: time="2025-05-27T03:52:31.645676476Z" level=info msg="StartContainer for \"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\" returns successfully" May 27 03:52:32.026628 containerd[2800]: time="2025-05-27T03:52:32.026593827Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 03:52:32.028422 systemd[1]: cri-containerd-434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b.scope: Deactivated successfully. May 27 03:52:32.028741 systemd[1]: cri-containerd-434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b.scope: Consumed 906ms CPU time, 197.8M memory peak, 165.5M written to disk. May 27 03:52:32.029251 containerd[2800]: time="2025-05-27T03:52:32.029223793Z" level=info msg="received exit event container_id:\"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\" id:\"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\" pid:5473 exited_at:{seconds:1748317952 nanos:29096113}" May 27 03:52:32.029349 containerd[2800]: time="2025-05-27T03:52:32.029329513Z" level=info msg="TaskExit event in podsandbox handler container_id:\"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\" id:\"434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b\" pid:5473 exited_at:{seconds:1748317952 nanos:29096113}" May 27 03:52:32.045268 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b-rootfs.mount: Deactivated successfully. May 27 03:52:32.059779 kubelet[4330]: I0527 03:52:32.059751 4330 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 03:52:32.078052 systemd[1]: Created slice kubepods-burstable-pod7b582b11_35a7_4cd7_a048_ee32ed51f7c8.slice - libcontainer container kubepods-burstable-pod7b582b11_35a7_4cd7_a048_ee32ed51f7c8.slice. 
May 27 03:52:32.082764 systemd[1]: Created slice kubepods-burstable-podcb4b90f2_6dd3_4625_bfc6_cf0978ad54c8.slice - libcontainer container kubepods-burstable-podcb4b90f2_6dd3_4625_bfc6_cf0978ad54c8.slice. May 27 03:52:32.086719 systemd[1]: Created slice kubepods-besteffort-pod3619f166_91f2_40b2_bad8_3ee4a2356c2c.slice - libcontainer container kubepods-besteffort-pod3619f166_91f2_40b2_bad8_3ee4a2356c2c.slice. May 27 03:52:32.093553 systemd[1]: Created slice kubepods-besteffort-pode64f219b_1dd1_4fcf_9ea3_0aee89fc3d09.slice - libcontainer container kubepods-besteffort-pode64f219b_1dd1_4fcf_9ea3_0aee89fc3d09.slice. May 27 03:52:32.102123 systemd[1]: Created slice kubepods-besteffort-pod073a67af_2b67_4e56_b118_5e4b8a92e30f.slice - libcontainer container kubepods-besteffort-pod073a67af_2b67_4e56_b118_5e4b8a92e30f.slice. May 27 03:52:32.112109 systemd[1]: Created slice kubepods-besteffort-podd96e496f_757b_4edd_abf6_47a9afc6e5a2.slice - libcontainer container kubepods-besteffort-podd96e496f_757b_4edd_abf6_47a9afc6e5a2.slice. May 27 03:52:32.115371 systemd[1]: Created slice kubepods-besteffort-pod5718ef86_6991_4e86_870c_b776228460c0.slice - libcontainer container kubepods-besteffort-pod5718ef86_6991_4e86_870c_b776228460c0.slice. May 27 03:52:32.185756 kubelet[4330]: I0527 03:52:32.185716 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zd2\" (UniqueName: \"kubernetes.io/projected/3619f166-91f2-40b2-bad8-3ee4a2356c2c-kube-api-access-n4zd2\") pod \"calico-kube-controllers-65f67dbd-fgvq5\" (UID: \"3619f166-91f2-40b2-bad8-3ee4a2356c2c\") " pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" May 27 03:52:32.185756 kubelet[4330]: I0527 03:52:32.185755 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/073a67af-2b67-4e56-b118-5e4b8a92e30f-calico-apiserver-certs\") pod \"calico-apiserver-658b59d8b7-cq2pw\" (UID: \"073a67af-2b67-4e56-b118-5e4b8a92e30f\") " pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" May 27 03:52:32.185975 kubelet[4330]: I0527 03:52:32.185773 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzp6\" (UniqueName: \"kubernetes.io/projected/073a67af-2b67-4e56-b118-5e4b8a92e30f-kube-api-access-cnzp6\") pod \"calico-apiserver-658b59d8b7-cq2pw\" (UID: \"073a67af-2b67-4e56-b118-5e4b8a92e30f\") " pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" May 27 03:52:32.185975 kubelet[4330]: I0527 03:52:32.185789 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b582b11-35a7-4cd7-a048-ee32ed51f7c8-config-volume\") pod \"coredns-668d6bf9bc-8jk2j\" (UID: \"7b582b11-35a7-4cd7-a048-ee32ed51f7c8\") " pod="kube-system/coredns-668d6bf9bc-8jk2j" May 27 03:52:32.185975 kubelet[4330]: I0527 03:52:32.185882 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmws\" (UniqueName: \"kubernetes.io/projected/7b582b11-35a7-4cd7-a048-ee32ed51f7c8-kube-api-access-8bmws\") pod \"coredns-668d6bf9bc-8jk2j\" (UID: \"7b582b11-35a7-4cd7-a048-ee32ed51f7c8\") " pod="kube-system/coredns-668d6bf9bc-8jk2j" May 27 03:52:32.185975 kubelet[4330]: I0527 03:52:32.185941 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s42v\" 
(UniqueName: \"kubernetes.io/projected/d96e496f-757b-4edd-abf6-47a9afc6e5a2-kube-api-access-7s42v\") pod \"whisker-64c9f55499-drhl6\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " pod="calico-system/whisker-64c9f55499-drhl6" May 27 03:52:32.186096 kubelet[4330]: I0527 03:52:32.186034 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfzv\" (UniqueName: \"kubernetes.io/projected/e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09-kube-api-access-wwfzv\") pod \"calico-apiserver-658b59d8b7-5zqdk\" (UID: \"e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09\") " pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" May 27 03:52:32.186096 kubelet[4330]: I0527 03:52:32.186068 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzsp\" (UniqueName: \"kubernetes.io/projected/cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8-kube-api-access-gtzsp\") pod \"coredns-668d6bf9bc-2x48w\" (UID: \"cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8\") " pod="kube-system/coredns-668d6bf9bc-2x48w" May 27 03:52:32.186096 kubelet[4330]: I0527 03:52:32.186084 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5718ef86-6991-4e86-870c-b776228460c0-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-tt8gf\" (UID: \"5718ef86-6991-4e86-870c-b776228460c0\") " pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.186153 kubelet[4330]: I0527 03:52:32.186121 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-ca-bundle\") pod \"whisker-64c9f55499-drhl6\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " pod="calico-system/whisker-64c9f55499-drhl6" May 27 03:52:32.186216 kubelet[4330]: I0527 03:52:32.186195 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5718ef86-6991-4e86-870c-b776228460c0-config\") pod \"goldmane-78d55f7ddc-tt8gf\" (UID: \"5718ef86-6991-4e86-870c-b776228460c0\") " pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.186243 kubelet[4330]: I0527 03:52:32.186220 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09-calico-apiserver-certs\") pod \"calico-apiserver-658b59d8b7-5zqdk\" (UID: \"e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09\") " pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" May 27 03:52:32.186243 kubelet[4330]: I0527 03:52:32.186239 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8-config-volume\") pod \"coredns-668d6bf9bc-2x48w\" (UID: \"cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8\") " pod="kube-system/coredns-668d6bf9bc-2x48w" May 27 03:52:32.186280 kubelet[4330]: I0527 03:52:32.186254 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/5718ef86-6991-4e86-870c-b776228460c0-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-tt8gf\" (UID: \"5718ef86-6991-4e86-870c-b776228460c0\") " pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.186329 
kubelet[4330]: I0527 03:52:32.186314 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3619f166-91f2-40b2-bad8-3ee4a2356c2c-tigera-ca-bundle\") pod \"calico-kube-controllers-65f67dbd-fgvq5\" (UID: \"3619f166-91f2-40b2-bad8-3ee4a2356c2c\") " pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" May 27 03:52:32.186356 kubelet[4330]: I0527 03:52:32.186346 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5z2l\" (UniqueName: \"kubernetes.io/projected/5718ef86-6991-4e86-870c-b776228460c0-kube-api-access-h5z2l\") pod \"goldmane-78d55f7ddc-tt8gf\" (UID: \"5718ef86-6991-4e86-870c-b776228460c0\") " pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.186376 kubelet[4330]: I0527 03:52:32.186368 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-backend-key-pair\") pod \"whisker-64c9f55499-drhl6\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " pod="calico-system/whisker-64c9f55499-drhl6" May 27 03:52:32.354707 systemd[1]: Created slice kubepods-besteffort-podff1e6702_11d8_474c_81da_442634bba8ed.slice - libcontainer container kubepods-besteffort-podff1e6702_11d8_474c_81da_442634bba8ed.slice. May 27 03:52:32.356436 containerd[2800]: time="2025-05-27T03:52:32.356393874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qm7vf,Uid:ff1e6702-11d8-474c-81da-442634bba8ed,Namespace:calico-system,Attempt:0,}" May 27 03:52:32.381103 containerd[2800]: time="2025-05-27T03:52:32.381072535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8jk2j,Uid:7b582b11-35a7-4cd7-a048-ee32ed51f7c8,Namespace:kube-system,Attempt:0,}" May 27 03:52:32.385553 containerd[2800]: time="2025-05-27T03:52:32.385528826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2x48w,Uid:cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8,Namespace:kube-system,Attempt:0,}" May 27 03:52:32.389088 containerd[2800]: time="2025-05-27T03:52:32.389056794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65f67dbd-fgvq5,Uid:3619f166-91f2-40b2-bad8-3ee4a2356c2c,Namespace:calico-system,Attempt:0,}" May 27 03:52:32.397302 containerd[2800]: time="2025-05-27T03:52:32.397276334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 03:52:32.398715 containerd[2800]: time="2025-05-27T03:52:32.398691098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-5zqdk,Uid:e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09,Namespace:calico-apiserver,Attempt:0,}" May 27 03:52:32.406013 containerd[2800]: time="2025-05-27T03:52:32.405987916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-cq2pw,Uid:073a67af-2b67-4e56-b118-5e4b8a92e30f,Namespace:calico-apiserver,Attempt:0,}" May 27 03:52:32.414921 containerd[2800]: time="2025-05-27T03:52:32.414884777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64c9f55499-drhl6,Uid:d96e496f-757b-4edd-abf6-47a9afc6e5a2,Namespace:calico-system,Attempt:0,}" May 27 03:52:32.417419 containerd[2800]: time="2025-05-27T03:52:32.417391944Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-78d55f7ddc-tt8gf,Uid:5718ef86-6991-4e86-870c-b776228460c0,Namespace:calico-system,Attempt:0,}" May 27 03:52:32.419172 containerd[2800]: time="2025-05-27T03:52:32.419138308Z" level=error msg="Failed to destroy network for sandbox \"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.419711 containerd[2800]: time="2025-05-27T03:52:32.419673829Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qm7vf,Uid:ff1e6702-11d8-474c-81da-442634bba8ed,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.419869 kubelet[4330]: E0527 03:52:32.419834 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.419924 kubelet[4330]: E0527 03:52:32.419907 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:32.419945 kubelet[4330]: E0527 03:52:32.419929 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qm7vf" May 27 03:52:32.419990 kubelet[4330]: E0527 03:52:32.419971 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-qm7vf_calico-system(ff1e6702-11d8-474c-81da-442634bba8ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-qm7vf_calico-system(ff1e6702-11d8-474c-81da-442634bba8ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"be90b9554214e03130e75dbbef04b6153233ec9bc4e460a39196099dbe7fa1ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-qm7vf" podUID="ff1e6702-11d8-474c-81da-442634bba8ed" May 27 03:52:32.438216 containerd[2800]: time="2025-05-27T03:52:32.438177274Z" level=error msg="Failed to destroy network for sandbox \"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.438747 containerd[2800]: time="2025-05-27T03:52:32.438716716Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8jk2j,Uid:7b582b11-35a7-4cd7-a048-ee32ed51f7c8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.438945 kubelet[4330]: E0527 03:52:32.438908 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.438992 kubelet[4330]: E0527 03:52:32.438963 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8jk2j" May 27 03:52:32.438992 kubelet[4330]: E0527 03:52:32.438982 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8jk2j" May 27 03:52:32.439044 kubelet[4330]: E0527 03:52:32.439022 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-8jk2j_kube-system(7b582b11-35a7-4cd7-a048-ee32ed51f7c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-8jk2j_kube-system(7b582b11-35a7-4cd7-a048-ee32ed51f7c8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b21013811261e00cca14e79668305a30bf00e093abb2c885fff1c39855cf1de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-8jk2j" podUID="7b582b11-35a7-4cd7-a048-ee32ed51f7c8" May 27 03:52:32.450294 containerd[2800]: time="2025-05-27T03:52:32.450242264Z" level=error msg="Failed to destroy network for sandbox \"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.450979 containerd[2800]: time="2025-05-27T03:52:32.450889586Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-5zqdk,Uid:e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.451143 kubelet[4330]: E0527 03:52:32.451098 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.451190 kubelet[4330]: E0527 03:52:32.451169 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" May 27 03:52:32.451212 kubelet[4330]: E0527 03:52:32.451189 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" May 27 03:52:32.451257 kubelet[4330]: E0527 03:52:32.451232 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658b59d8b7-5zqdk_calico-apiserver(e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658b59d8b7-5zqdk_calico-apiserver(e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6a60af31e3a13f125af65de6fc143d0bddf3141748462d297a7d33aff984891\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" podUID="e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09" May 27 03:52:32.451694 containerd[2800]: time="2025-05-27T03:52:32.451664107Z" level=error msg="Failed to destroy network for sandbox \"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452022 containerd[2800]: time="2025-05-27T03:52:32.451994268Z" level=error msg="Failed to destroy network for sandbox \"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 
27 03:52:32.452047 containerd[2800]: time="2025-05-27T03:52:32.452025748Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2x48w,Uid:cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452169 kubelet[4330]: E0527 03:52:32.452144 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452205 kubelet[4330]: E0527 03:52:32.452187 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2x48w" May 27 03:52:32.452230 kubelet[4330]: E0527 03:52:32.452208 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2x48w" May 27 03:52:32.452260 kubelet[4330]: E0527 03:52:32.452242 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-2x48w_kube-system(cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-2x48w_kube-system(cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bddac78856f4120a539d33743aa45fe3d2b627d51125f955e753a68df8f87ce0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-2x48w" podUID="cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8" May 27 03:52:32.452337 containerd[2800]: time="2025-05-27T03:52:32.452313509Z" level=error msg="Failed to destroy network for sandbox \"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452378 containerd[2800]: time="2025-05-27T03:52:32.452349069Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65f67dbd-fgvq5,Uid:3619f166-91f2-40b2-bad8-3ee4a2356c2c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452509 kubelet[4330]: E0527 03:52:32.452481 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452536 kubelet[4330]: E0527 03:52:32.452526 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" May 27 03:52:32.452559 kubelet[4330]: E0527 03:52:32.452543 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" May 27 03:52:32.452593 kubelet[4330]: E0527 03:52:32.452576 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65f67dbd-fgvq5_calico-system(3619f166-91f2-40b2-bad8-3ee4a2356c2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65f67dbd-fgvq5_calico-system(3619f166-91f2-40b2-bad8-3ee4a2356c2c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab0d80ac0686fe85b2a216ffa3f3c5301de19cfbe59644151cfca887d96f2d68\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" podUID="3619f166-91f2-40b2-bad8-3ee4a2356c2c" May 27 03:52:32.452631 containerd[2800]: time="2025-05-27T03:52:32.452604470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-cq2pw,Uid:073a67af-2b67-4e56-b118-5e4b8a92e30f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.452726 kubelet[4330]: E0527 03:52:32.452702 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 27 03:52:32.452755 kubelet[4330]: E0527 03:52:32.452737 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" May 27 03:52:32.452777 kubelet[4330]: E0527 03:52:32.452752 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" May 27 03:52:32.452801 kubelet[4330]: E0527 03:52:32.452778 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-658b59d8b7-cq2pw_calico-apiserver(073a67af-2b67-4e56-b118-5e4b8a92e30f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-658b59d8b7-cq2pw_calico-apiserver(073a67af-2b67-4e56-b118-5e4b8a92e30f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1734950697b6df64ef6e8f378c122a21a951374421f3556232803e89f590be27\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" podUID="073a67af-2b67-4e56-b118-5e4b8a92e30f" May 27 03:52:32.458411 containerd[2800]: time="2025-05-27T03:52:32.458382644Z" level=error msg="Failed to destroy network for sandbox \"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.458781 containerd[2800]: time="2025-05-27T03:52:32.458754885Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64c9f55499-drhl6,Uid:d96e496f-757b-4edd-abf6-47a9afc6e5a2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.458851 containerd[2800]: time="2025-05-27T03:52:32.458807325Z" level=error msg="Failed to destroy network for sandbox \"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.458895 kubelet[4330]: E0527 03:52:32.458874 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.458927 kubelet[4330]: E0527 03:52:32.458909 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64c9f55499-drhl6" May 27 03:52:32.458950 kubelet[4330]: E0527 03:52:32.458927 4330 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64c9f55499-drhl6" May 27 03:52:32.458979 kubelet[4330]: E0527 03:52:32.458959 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-64c9f55499-drhl6_calico-system(d96e496f-757b-4edd-abf6-47a9afc6e5a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-64c9f55499-drhl6_calico-system(d96e496f-757b-4edd-abf6-47a9afc6e5a2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e7cf03f64a18487779d68529b357aeaec5f2d17c5f8efa220216284f0c920e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-64c9f55499-drhl6" podUID="d96e496f-757b-4edd-abf6-47a9afc6e5a2" May 27 03:52:32.459134 containerd[2800]: time="2025-05-27T03:52:32.459106366Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-tt8gf,Uid:5718ef86-6991-4e86-870c-b776228460c0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.459225 kubelet[4330]: E0527 03:52:32.459206 4330 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:52:32.459249 kubelet[4330]: E0527 03:52:32.459240 4330 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.459269 kubelet[4330]: E0527 03:52:32.459254 4330 kuberuntime_manager.go:1237] "CreatePodSandbox 
for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-tt8gf" May 27 03:52:32.459299 kubelet[4330]: E0527 03:52:32.459283 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc8c4fd2d718706e5f8306bfbc6083ce1896cb3e0a4b818102c843f2770a8e8f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:52:34.918411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3771948208.mount: Deactivated successfully. May 27 03:52:34.933526 containerd[2800]: time="2025-05-27T03:52:34.933486474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:34.933714 containerd[2800]: time="2025-05-27T03:52:34.933530114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 03:52:34.934085 containerd[2800]: time="2025-05-27T03:52:34.934067435Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:34.935542 containerd[2800]: time="2025-05-27T03:52:34.935514438Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:34.936109 containerd[2800]: time="2025-05-27T03:52:34.936086160Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 2.538778226s" May 27 03:52:34.936141 containerd[2800]: time="2025-05-27T03:52:34.936117000Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 03:52:34.941317 containerd[2800]: time="2025-05-27T03:52:34.941239931Z" level=info msg="CreateContainer within sandbox \"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 03:52:34.945920 containerd[2800]: time="2025-05-27T03:52:34.945893181Z" level=info msg="Container 1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:34.954138 containerd[2800]: time="2025-05-27T03:52:34.954114478Z" level=info msg="CreateContainer within sandbox 
\"0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\"" May 27 03:52:34.954459 containerd[2800]: time="2025-05-27T03:52:34.954425079Z" level=info msg="StartContainer for \"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\"" May 27 03:52:34.955827 containerd[2800]: time="2025-05-27T03:52:34.955803722Z" level=info msg="connecting to shim 1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1" address="unix:///run/containerd/s/a0618409e2be4f44a3fa09d8e8806d7701e7b02aee1c7f9ba0943975c6be5144" protocol=ttrpc version=3 May 27 03:52:34.986788 systemd[1]: Started cri-containerd-1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1.scope - libcontainer container 1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1. May 27 03:52:35.016775 containerd[2800]: time="2025-05-27T03:52:35.016743731Z" level=info msg="StartContainer for \"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" returns successfully" May 27 03:52:35.145211 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 03:52:35.145322 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 27 03:52:35.304395 kubelet[4330]: I0527 03:52:35.304361 4330 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-backend-key-pair\") pod \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " May 27 03:52:35.304395 kubelet[4330]: I0527 03:52:35.304400 4330 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-ca-bundle\") pod \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " May 27 03:52:35.304822 kubelet[4330]: I0527 03:52:35.304420 4330 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s42v\" (UniqueName: \"kubernetes.io/projected/d96e496f-757b-4edd-abf6-47a9afc6e5a2-kube-api-access-7s42v\") pod \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\" (UID: \"d96e496f-757b-4edd-abf6-47a9afc6e5a2\") " May 27 03:52:35.304822 kubelet[4330]: I0527 03:52:35.304765 4330 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d96e496f-757b-4edd-abf6-47a9afc6e5a2" (UID: "d96e496f-757b-4edd-abf6-47a9afc6e5a2"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 03:52:35.306742 kubelet[4330]: I0527 03:52:35.306721 4330 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d96e496f-757b-4edd-abf6-47a9afc6e5a2" (UID: "d96e496f-757b-4edd-abf6-47a9afc6e5a2"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 03:52:35.306793 kubelet[4330]: I0527 03:52:35.306772 4330 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96e496f-757b-4edd-abf6-47a9afc6e5a2-kube-api-access-7s42v" (OuterVolumeSpecName: "kube-api-access-7s42v") pod "d96e496f-757b-4edd-abf6-47a9afc6e5a2" (UID: "d96e496f-757b-4edd-abf6-47a9afc6e5a2"). InnerVolumeSpecName "kube-api-access-7s42v". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 03:52:35.404616 kubelet[4330]: I0527 03:52:35.404592 4330 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-ca-bundle\") on node \"ci-4344.0.0-a-636136d453\" DevicePath \"\"" May 27 03:52:35.404616 kubelet[4330]: I0527 03:52:35.404610 4330 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7s42v\" (UniqueName: \"kubernetes.io/projected/d96e496f-757b-4edd-abf6-47a9afc6e5a2-kube-api-access-7s42v\") on node \"ci-4344.0.0-a-636136d453\" DevicePath \"\"" May 27 03:52:35.404616 kubelet[4330]: I0527 03:52:35.404620 4330 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d96e496f-757b-4edd-abf6-47a9afc6e5a2-whisker-backend-key-pair\") on node \"ci-4344.0.0-a-636136d453\" DevicePath \"\"" May 27 03:52:35.413920 systemd[1]: Removed slice kubepods-besteffort-podd96e496f_757b_4edd_abf6_47a9afc6e5a2.slice - libcontainer container kubepods-besteffort-podd96e496f_757b_4edd_abf6_47a9afc6e5a2.slice. May 27 03:52:35.422792 kubelet[4330]: I0527 03:52:35.422744 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8lnmb" podStartSLOduration=1.002601089 podStartE2EDuration="7.42273083s" podCreationTimestamp="2025-05-27 03:52:28 +0000 UTC" firstStartedPulling="2025-05-27 03:52:28.51647698 +0000 UTC m=+20.249315200" lastFinishedPulling="2025-05-27 03:52:34.936606721 +0000 UTC m=+26.669444941" observedRunningTime="2025-05-27 03:52:35.422300469 +0000 UTC m=+27.155138689" watchObservedRunningTime="2025-05-27 03:52:35.42273083 +0000 UTC m=+27.155569050" May 27 03:52:35.449151 systemd[1]: Created slice kubepods-besteffort-poda3887ab7_1427_473a_8116_3289443cde48.slice - libcontainer container kubepods-besteffort-poda3887ab7_1427_473a_8116_3289443cde48.slice. 
May 27 03:52:35.505780 kubelet[4330]: I0527 03:52:35.505745 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss4zc\" (UniqueName: \"kubernetes.io/projected/a3887ab7-1427-473a-8116-3289443cde48-kube-api-access-ss4zc\") pod \"whisker-79d4c87959-gb96j\" (UID: \"a3887ab7-1427-473a-8116-3289443cde48\") " pod="calico-system/whisker-79d4c87959-gb96j" May 27 03:52:35.505833 kubelet[4330]: I0527 03:52:35.505821 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a3887ab7-1427-473a-8116-3289443cde48-whisker-backend-key-pair\") pod \"whisker-79d4c87959-gb96j\" (UID: \"a3887ab7-1427-473a-8116-3289443cde48\") " pod="calico-system/whisker-79d4c87959-gb96j" May 27 03:52:35.505914 kubelet[4330]: I0527 03:52:35.505893 4330 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3887ab7-1427-473a-8116-3289443cde48-whisker-ca-bundle\") pod \"whisker-79d4c87959-gb96j\" (UID: \"a3887ab7-1427-473a-8116-3289443cde48\") " pod="calico-system/whisker-79d4c87959-gb96j" May 27 03:52:35.751285 containerd[2800]: time="2025-05-27T03:52:35.751205253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79d4c87959-gb96j,Uid:a3887ab7-1427-473a-8116-3289443cde48,Namespace:calico-system,Attempt:0,}" May 27 03:52:35.869840 systemd-networkd[2575]: cali1290cb20356: Link UP May 27 03:52:35.870040 systemd-networkd[2575]: cali1290cb20356: Gained carrier May 27 03:52:35.877273 containerd[2800]: 2025-05-27 03:52:35.778 [INFO][6076] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 03:52:35.877273 containerd[2800]: 2025-05-27 03:52:35.792 [INFO][6076] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0 whisker-79d4c87959- calico-system a3887ab7-1427-473a-8116-3289443cde48 839 0 2025-05-27 03:52:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79d4c87959 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 whisker-79d4c87959-gb96j eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1290cb20356 [] [] }} ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-" May 27 03:52:35.877273 containerd[2800]: 2025-05-27 03:52:35.792 [INFO][6076] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877273 containerd[2800]: 2025-05-27 03:52:35.829 [INFO][6099] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" HandleID="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Workload="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.829 [INFO][6099] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" HandleID="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Workload="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005a5460), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-636136d453", "pod":"whisker-79d4c87959-gb96j", "timestamp":"2025-05-27 03:52:35.829602131 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.829 [INFO][6099] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.829 [INFO][6099] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.829 [INFO][6099] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.838 [INFO][6099] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.843 [INFO][6099] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.846 [INFO][6099] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.847 [INFO][6099] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877403 containerd[2800]: 2025-05-27 03:52:35.850 [INFO][6099] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.850 [INFO][6099] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.852 [INFO][6099] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002 May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.856 [INFO][6099] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.861 [INFO][6099] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.1/26] block=192.168.105.0/26 handle="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.861 [INFO][6099] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.1/26] handle="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" host="ci-4344.0.0-a-636136d453" May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.861 
[INFO][6099] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:52:35.877565 containerd[2800]: 2025-05-27 03:52:35.861 [INFO][6099] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.1/26] IPv6=[] ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" HandleID="k8s-pod-network.098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Workload="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877692 containerd[2800]: 2025-05-27 03:52:35.863 [INFO][6076] cni-plugin/k8s.go 418: Populated endpoint ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0", GenerateName:"whisker-79d4c87959-", Namespace:"calico-system", SelfLink:"", UID:"a3887ab7-1427-473a-8116-3289443cde48", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79d4c87959", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"whisker-79d4c87959-gb96j", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.105.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1290cb20356", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:35.877692 containerd[2800]: 2025-05-27 03:52:35.863 [INFO][6076] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.1/32] ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877760 containerd[2800]: 2025-05-27 03:52:35.863 [INFO][6076] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1290cb20356 ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877760 containerd[2800]: 2025-05-27 03:52:35.870 [INFO][6076] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.877793 containerd[2800]: 2025-05-27 03:52:35.870 [INFO][6076] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0", GenerateName:"whisker-79d4c87959-", Namespace:"calico-system", SelfLink:"", UID:"a3887ab7-1427-473a-8116-3289443cde48", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79d4c87959", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002", Pod:"whisker-79d4c87959-gb96j", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.105.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1290cb20356", MAC:"1e:53:38:4e:99:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:35.877836 containerd[2800]: 2025-05-27 03:52:35.875 [INFO][6076] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" Namespace="calico-system" Pod="whisker-79d4c87959-gb96j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-whisker--79d4c87959--gb96j-eth0" May 27 03:52:35.887290 containerd[2800]: time="2025-05-27T03:52:35.887262247Z" level=info msg="connecting to shim 098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002" address="unix:///run/containerd/s/2ebb9044adf9f005ddd2d36bfd39b93f4860ba6ccb0306348fed8d65e62d4d0a" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:35.923776 systemd[1]: Started cri-containerd-098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002.scope - libcontainer container 098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002. May 27 03:52:35.929234 systemd[1]: var-lib-kubelet-pods-d96e496f\x2d757b\x2d4edd\x2dabf6\x2d47a9afc6e5a2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7s42v.mount: Deactivated successfully. May 27 03:52:35.929314 systemd[1]: var-lib-kubelet-pods-d96e496f\x2d757b\x2d4edd\x2dabf6\x2d47a9afc6e5a2-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
May 27 03:52:35.950464 containerd[2800]: time="2025-05-27T03:52:35.950436295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79d4c87959-gb96j,Uid:a3887ab7-1427-473a-8116-3289443cde48,Namespace:calico-system,Attempt:0,} returns sandbox id \"098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002\"" May 27 03:52:35.951482 containerd[2800]: time="2025-05-27T03:52:35.951462657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:52:35.972461 containerd[2800]: time="2025-05-27T03:52:35.972425699Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:35.972703 containerd[2800]: time="2025-05-27T03:52:35.972675739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:52:35.972743 containerd[2800]: time="2025-05-27T03:52:35.972676419Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:35.972905 kubelet[4330]: E0527 03:52:35.972863 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:52:35.972965 kubelet[4330]: E0527 03:52:35.972920 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:52:35.973130 kubelet[4330]: E0527 03:52:35.973096 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:35.975360 containerd[2800]: time="2025-05-27T03:52:35.975334705Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:52:35.997167 containerd[2800]: time="2025-05-27T03:52:35.997127269Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:35.997421 containerd[2800]: time="2025-05-27T03:52:35.997394389Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:35.997476 containerd[2800]: time="2025-05-27T03:52:35.997452989Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:52:35.997573 kubelet[4330]: E0527 03:52:35.997540 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:52:35.997610 kubelet[4330]: E0527 03:52:35.997583 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:52:35.997780 kubelet[4330]: E0527 03:52:35.997731 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:35.998925 kubelet[4330]: E0527 03:52:35.998892 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:52:36.353442 kubelet[4330]: I0527 03:52:36.353405 4330 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96e496f-757b-4edd-abf6-47a9afc6e5a2" path="/var/lib/kubelet/pods/d96e496f-757b-4edd-abf6-47a9afc6e5a2/volumes" May 27 03:52:36.410040 kubelet[4330]: I0527 03:52:36.410016 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:36.410989 kubelet[4330]: E0527 03:52:36.410955 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:52:37.413832 kubelet[4330]: E0527 03:52:37.413790 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:52:37.916785 systemd-networkd[2575]: cali1290cb20356: Gained IPv6LL May 27 03:52:38.325339 kubelet[4330]: I0527 03:52:38.325268 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:38.692410 systemd-networkd[2575]: vxlan.calico: Link UP May 27 03:52:38.692415 systemd-networkd[2575]: vxlan.calico: Gained carrier May 27 03:52:40.348763 systemd-networkd[2575]: vxlan.calico: Gained IPv6LL May 27 03:52:42.820402 kubelet[4330]: I0527 03:52:42.820313 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:42.888056 containerd[2800]: time="2025-05-27T03:52:42.888019812Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"69c26723243b3ffaddfd0a8c782b5fe6acd9ce193bc0794d6e584560f8650118\" pid:6776 exited_at:{seconds:1748317962 nanos:887718212}" May 27 03:52:42.975809 containerd[2800]: time="2025-05-27T03:52:42.975772445Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"eb9edadf49c7b66b2176afd09f1b8e2af2b5289c7463d447852517ea7dc5b8d4\" pid:6808 exited_at:{seconds:1748317962 nanos:975319684}" May 27 03:52:43.351178 containerd[2800]: time="2025-05-27T03:52:43.351148779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2x48w,Uid:cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8,Namespace:kube-system,Attempt:0,}" May 27 03:52:43.351274 containerd[2800]: time="2025-05-27T03:52:43.351152299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-tt8gf,Uid:5718ef86-6991-4e86-870c-b776228460c0,Namespace:calico-system,Attempt:0,}" May 27 03:52:43.446003 systemd-networkd[2575]: cali9797fa4dab7: Link UP May 27 03:52:43.446336 systemd-networkd[2575]: cali9797fa4dab7: Gained carrier May 27 03:52:43.454444 containerd[2800]: 2025-05-27 03:52:43.384 [INFO][6834] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0 goldmane-78d55f7ddc- calico-system 5718ef86-6991-4e86-870c-b776228460c0 763 0 2025-05-27 03:52:28 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 goldmane-78d55f7ddc-tt8gf eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali9797fa4dab7 [] [] }} ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-" May 27 03:52:43.454444 containerd[2800]: 2025-05-27 03:52:43.384 [INFO][6834] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.454444 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6885] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" HandleID="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Workload="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6885] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" HandleID="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Workload="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001b7720), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-636136d453", "pod":"goldmane-78d55f7ddc-tt8gf", "timestamp":"2025-05-27 03:52:43.404593963 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6885] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6885] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6885] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.412 [INFO][6885] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.416 [INFO][6885] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.419 [INFO][6885] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.420 [INFO][6885] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454605 containerd[2800]: 2025-05-27 03:52:43.422 [INFO][6885] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.422 [INFO][6885] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.423 [INFO][6885] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2 May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.425 [INFO][6885] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.442 [INFO][6885] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.2/26] block=192.168.105.0/26 handle="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" 
host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.442 [INFO][6885] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.2/26] handle="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.442 [INFO][6885] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:52:43.454809 containerd[2800]: 2025-05-27 03:52:43.442 [INFO][6885] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.2/26] IPv6=[] ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" HandleID="k8s-pod-network.a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Workload="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.454930 containerd[2800]: 2025-05-27 03:52:43.444 [INFO][6834] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"5718ef86-6991-4e86-870c-b776228460c0", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"goldmane-78d55f7ddc-tt8gf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9797fa4dab7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:43.454930 containerd[2800]: 2025-05-27 03:52:43.444 [INFO][6834] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.2/32] ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.454996 containerd[2800]: 2025-05-27 03:52:43.444 [INFO][6834] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9797fa4dab7 ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.454996 containerd[2800]: 2025-05-27 03:52:43.446 [INFO][6834] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.455039 containerd[2800]: 2025-05-27 03:52:43.446 [INFO][6834] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"5718ef86-6991-4e86-870c-b776228460c0", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2", Pod:"goldmane-78d55f7ddc-tt8gf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.105.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9797fa4dab7", MAC:"c6:d9:a3:d8:e7:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:43.455084 containerd[2800]: 2025-05-27 03:52:43.453 [INFO][6834] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" Namespace="calico-system" Pod="goldmane-78d55f7ddc-tt8gf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-goldmane--78d55f7ddc--tt8gf-eth0" May 27 03:52:43.464730 containerd[2800]: time="2025-05-27T03:52:43.464693875Z" level=info msg="connecting to shim a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2" address="unix:///run/containerd/s/4ef7197d527a8ba37d7710700cfff0c0b4154efeb0b5bd4694a151bcef2d21d4" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:43.493773 systemd[1]: Started cri-containerd-a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2.scope - libcontainer container a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2. 
May 27 03:52:43.519896 containerd[2800]: time="2025-05-27T03:52:43.519867702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-tt8gf,Uid:5718ef86-6991-4e86-870c-b776228460c0,Namespace:calico-system,Attempt:0,} returns sandbox id \"a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2\"" May 27 03:52:43.521724 containerd[2800]: time="2025-05-27T03:52:43.521690624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:52:43.536265 systemd-networkd[2575]: calibde38c8d43a: Link UP May 27 03:52:43.536592 systemd-networkd[2575]: calibde38c8d43a: Gained carrier May 27 03:52:43.543663 containerd[2800]: time="2025-05-27T03:52:43.543638010Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:43.544359 containerd[2800]: 2025-05-27 03:52:43.384 [INFO][6833] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0 coredns-668d6bf9bc- kube-system cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8 765 0 2025-05-27 03:52:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 coredns-668d6bf9bc-2x48w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibde38c8d43a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-" May 27 03:52:43.544359 containerd[2800]: 2025-05-27 03:52:43.384 [INFO][6833] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544359 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6887] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" HandleID="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.404 [INFO][6887] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" HandleID="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003c15d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-636136d453", "pod":"coredns-668d6bf9bc-2x48w", "timestamp":"2025-05-27 03:52:43.404602603 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:43.544465 containerd[2800]: 
2025-05-27 03:52:43.404 [INFO][6887] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.442 [INFO][6887] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.443 [INFO][6887] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.513 [INFO][6887] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.516 [INFO][6887] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.519 [INFO][6887] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.521 [INFO][6887] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544465 containerd[2800]: 2025-05-27 03:52:43.523 [INFO][6887] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.523 [INFO][6887] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.524 [INFO][6887] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.529 [INFO][6887] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.532 [INFO][6887] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.3/26] block=192.168.105.0/26 handle="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.533 [INFO][6887] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.3/26] handle="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" host="ci-4344.0.0-a-636136d453" May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.533 [INFO][6887] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:52:43.544629 containerd[2800]: 2025-05-27 03:52:43.533 [INFO][6887] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.3/26] IPv6=[] ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" HandleID="k8s-pod-network.7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.534 [INFO][6833] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8", ResourceVersion:"765", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"coredns-668d6bf9bc-2x48w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibde38c8d43a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.534 [INFO][6833] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.3/32] ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.534 [INFO][6833] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibde38c8d43a ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.536 [INFO][6833] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.537 [INFO][6833] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8", ResourceVersion:"765", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d", Pod:"coredns-668d6bf9bc-2x48w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibde38c8d43a", MAC:"1a:13:33:43:89:00", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:43.544767 containerd[2800]: 2025-05-27 03:52:43.542 [INFO][6833] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" Namespace="kube-system" Pod="coredns-668d6bf9bc-2x48w" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--2x48w-eth0" May 27 03:52:43.556684 containerd[2800]: time="2025-05-27T03:52:43.556639106Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:43.556780 containerd[2800]: time="2025-05-27T03:52:43.556681506Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:52:43.556887 kubelet[4330]: E0527 03:52:43.556846 4330 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:52:43.556939 kubelet[4330]: E0527 03:52:43.556895 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:52:43.557068 kubelet[4330]: E0527 03:52:43.557022 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:43.558210 kubelet[4330]: E0527 03:52:43.558173 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:52:43.564877 containerd[2800]: time="2025-05-27T03:52:43.564845476Z" level=info msg="connecting to shim 7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d" address="unix:///run/containerd/s/443a6059b0476d6c3bdbcfd0038c2da4cb9f6d9d5312494aa6377be9c00e8ab4" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:43.597850 systemd[1]: Started cri-containerd-7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d.scope - libcontainer container 7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d. 
May 27 03:52:43.625367 containerd[2800]: time="2025-05-27T03:52:43.625331789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2x48w,Uid:cb4b90f2-6dd3-4625-bfc6-cf0978ad54c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d\"" May 27 03:52:43.627161 containerd[2800]: time="2025-05-27T03:52:43.627135831Z" level=info msg="CreateContainer within sandbox \"7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 03:52:43.631763 containerd[2800]: time="2025-05-27T03:52:43.631732196Z" level=info msg="Container ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:43.634336 containerd[2800]: time="2025-05-27T03:52:43.634311839Z" level=info msg="CreateContainer within sandbox \"7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0\"" May 27 03:52:43.634598 containerd[2800]: time="2025-05-27T03:52:43.634575000Z" level=info msg="StartContainer for \"ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0\"" May 27 03:52:43.635319 containerd[2800]: time="2025-05-27T03:52:43.635296441Z" level=info msg="connecting to shim ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0" address="unix:///run/containerd/s/443a6059b0476d6c3bdbcfd0038c2da4cb9f6d9d5312494aa6377be9c00e8ab4" protocol=ttrpc version=3 May 27 03:52:43.665869 systemd[1]: Started cri-containerd-ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0.scope - libcontainer container ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0. 
May 27 03:52:43.703003 containerd[2800]: time="2025-05-27T03:52:43.702976762Z" level=info msg="StartContainer for \"ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0\" returns successfully" May 27 03:52:44.351072 containerd[2800]: time="2025-05-27T03:52:44.351021796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-cq2pw,Uid:073a67af-2b67-4e56-b118-5e4b8a92e30f,Namespace:calico-apiserver,Attempt:0,}" May 27 03:52:44.351483 containerd[2800]: time="2025-05-27T03:52:44.351023076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-5zqdk,Uid:e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09,Namespace:calico-apiserver,Attempt:0,}" May 27 03:52:44.423303 kubelet[4330]: E0527 03:52:44.423251 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:52:44.430443 systemd-networkd[2575]: cali7a752744fd2: Link UP May 27 03:52:44.431622 systemd-networkd[2575]: cali7a752744fd2: Gained carrier May 27 03:52:44.438307 kubelet[4330]: I0527 03:52:44.438257 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-2x48w" podStartSLOduration=30.438243374 podStartE2EDuration="30.438243374s" podCreationTimestamp="2025-05-27 03:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:44.437929094 +0000 UTC m=+36.170767314" watchObservedRunningTime="2025-05-27 03:52:44.438243374 +0000 UTC m=+36.171081594" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.382 [INFO][7094] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0 calico-apiserver-658b59d8b7- calico-apiserver e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09 762 0 2025-05-27 03:52:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658b59d8b7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 calico-apiserver-658b59d8b7-5zqdk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7a752744fd2 [] [] }} ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.382 [INFO][7094] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" 
WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.402 [INFO][7144] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" HandleID="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7144] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" HandleID="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d8f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-636136d453", "pod":"calico-apiserver-658b59d8b7-5zqdk", "timestamp":"2025-05-27 03:52:44.402870014 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7144] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7144] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7144] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.410 [INFO][7144] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.413 [INFO][7144] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.416 [INFO][7144] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.418 [INFO][7144] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.420 [INFO][7144] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.420 [INFO][7144] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.421 [INFO][7144] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90 May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.423 [INFO][7144] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" host="ci-4344.0.0-a-636136d453" May 27 
03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7144] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.4/26] block=192.168.105.0/26 handle="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7144] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.4/26] handle="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7144] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:52:44.442167 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7144] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.4/26] IPv6=[] ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" HandleID="k8s-pod-network.8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.428 [INFO][7094] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0", GenerateName:"calico-apiserver-658b59d8b7-", Namespace:"calico-apiserver", SelfLink:"", UID:"e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658b59d8b7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"calico-apiserver-658b59d8b7-5zqdk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7a752744fd2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.428 [INFO][7094] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.4/32] ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.428 [INFO][7094] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a752744fd2 
ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.433 [INFO][7094] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.435 [INFO][7094] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0", GenerateName:"calico-apiserver-658b59d8b7-", Namespace:"calico-apiserver", SelfLink:"", UID:"e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658b59d8b7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90", Pod:"calico-apiserver-658b59d8b7-5zqdk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7a752744fd2", MAC:"56:11:92:55:dd:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:44.442591 containerd[2800]: 2025-05-27 03:52:44.440 [INFO][7094] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-5zqdk" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--5zqdk-eth0" May 27 03:52:44.452531 containerd[2800]: time="2025-05-27T03:52:44.452444470Z" level=info msg="connecting to shim 8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90" address="unix:///run/containerd/s/a3000f8d7df56b50931654d140c352589c4318a9a9953b8585a29b05b85db74d" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:44.483787 systemd[1]: Started cri-containerd-8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90.scope - libcontainer container 8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90. 
May 27 03:52:44.509623 containerd[2800]: time="2025-05-27T03:52:44.509590135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-5zqdk,Uid:e64f219b-1dd1-4fcf-9ea3-0aee89fc3d09,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90\"" May 27 03:52:44.510698 containerd[2800]: time="2025-05-27T03:52:44.510675296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 03:52:44.533429 systemd-networkd[2575]: calia91ebfef851: Link UP May 27 03:52:44.533694 systemd-networkd[2575]: calia91ebfef851: Gained carrier May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.382 [INFO][7089] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0 calico-apiserver-658b59d8b7- calico-apiserver 073a67af-2b67-4e56-b118-5e4b8a92e30f 764 0 2025-05-27 03:52:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:658b59d8b7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 calico-apiserver-658b59d8b7-cq2pw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia91ebfef851 [] [] }} ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.382 [INFO][7089] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.402 [INFO][7149] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" HandleID="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7149] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" HandleID="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003c0170), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-636136d453", "pod":"calico-apiserver-658b59d8b7-cq2pw", "timestamp":"2025-05-27 03:52:44.402967655 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.403 [INFO][7149] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7149] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.427 [INFO][7149] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.511 [INFO][7149] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.514 [INFO][7149] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.518 [INFO][7149] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.520 [INFO][7149] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.522 [INFO][7149] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.522 [INFO][7149] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.523 [INFO][7149] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860 May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.526 [INFO][7149] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.530 [INFO][7149] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.5/26] block=192.168.105.0/26 handle="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.530 [INFO][7149] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.5/26] handle="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" host="ci-4344.0.0-a-636136d453" May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.530 [INFO][7149] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
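The ipam/ipam.go entries above are Calico's IPAM walking its usual path for handler [7149]: take the host-wide lock, look up this node's block affinity (192.168.105.0/26), claim one address from that block, and release the lock. The AutoAssignArgs dump logged a few entries earlier maps directly onto libcalico-go's IPAM API; the sketch below mirrors those fields for calico-apiserver-658b59d8b7-cq2pw. Import paths and return types vary between Calico releases, so treat this as an assumption-laden outline, not the plugin's actual code (the timestamp attribute, IntendedUse, and the empty pool lists are omitted for brevity).

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/projectcalico/calico/libcalico-go/lib/clientv3"
	"github.com/projectcalico/calico/libcalico-go/lib/ipam"
)

func main() {
	// Datastore configuration is read from the environment.
	c, err := clientv3.NewFromEnv()
	if err != nil {
		log.Fatal(err)
	}

	// Field values copied from the AutoAssignArgs dump logged by handler [7149] above.
	handle := "k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860"
	args := ipam.AutoAssignArgs{
		Num4:     1,
		Num6:     0,
		HandleID: &handle,
		Hostname: "ci-4344.0.0-a-636136d453",
		Attrs: map[string]string{
			"namespace": "calico-apiserver",
			"node":      "ci-4344.0.0-a-636136d453",
			"pod":       "calico-apiserver-658b59d8b7-cq2pw",
		},
	}

	v4, _, err := c.IPAM().AutoAssign(context.Background(), args)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(v4) // e.g. 192.168.105.5/26, the address claimed in the log above
}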
May 27 03:52:44.541407 containerd[2800]: 2025-05-27 03:52:44.530 [INFO][7149] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.5/26] IPv6=[] ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" HandleID="k8s-pod-network.5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Workload="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.532 [INFO][7089] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0", GenerateName:"calico-apiserver-658b59d8b7-", Namespace:"calico-apiserver", SelfLink:"", UID:"073a67af-2b67-4e56-b118-5e4b8a92e30f", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658b59d8b7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"calico-apiserver-658b59d8b7-cq2pw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia91ebfef851", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.532 [INFO][7089] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.5/32] ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.532 [INFO][7089] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia91ebfef851 ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.533 [INFO][7089] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.533 [INFO][7089] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0", GenerateName:"calico-apiserver-658b59d8b7-", Namespace:"calico-apiserver", SelfLink:"", UID:"073a67af-2b67-4e56-b118-5e4b8a92e30f", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"658b59d8b7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860", Pod:"calico-apiserver-658b59d8b7-cq2pw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.105.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia91ebfef851", MAC:"aa:07:75:37:2e:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:44.541904 containerd[2800]: 2025-05-27 03:52:44.539 [INFO][7089] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" Namespace="calico-apiserver" Pod="calico-apiserver-658b59d8b7-cq2pw" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--apiserver--658b59d8b7--cq2pw-eth0" May 27 03:52:44.551425 containerd[2800]: time="2025-05-27T03:52:44.551393382Z" level=info msg="connecting to shim 5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860" address="unix:///run/containerd/s/f9353b6c8213185555a9c1620e8b1601ef3623ae78da95665f9ff12783de3565" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:44.587789 systemd[1]: Started cri-containerd-5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860.scope - libcontainer container 5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860. 
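Every assignment in this section comes out of the same affine block, 192.168.105.0/26, that the IPAM entries keep loading ("Trying affinity for 192.168.105.0/26 ..."). A minimal standard-library check of that containment, using the two calico-apiserver addresses taken from the log (illustration only):

package main

import (
	"fmt"
	"net"
)

func main() {
	// Block with this node's affinity, from the "Trying affinity for ..." entries.
	_, block, err := net.ParseCIDR("192.168.105.0/26")
	if err != nil {
		panic(err)
	}

	// Addresses claimed for the two calico-apiserver pods above.
	for _, ip := range []string{"192.168.105.4", "192.168.105.5"} {
		fmt.Println(ip, "in", block, "=", block.Contains(net.ParseIP(ip)))
	}
}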
May 27 03:52:44.614959 containerd[2800]: time="2025-05-27T03:52:44.614886494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-658b59d8b7-cq2pw,Uid:073a67af-2b67-4e56-b118-5e4b8a92e30f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860\"" May 27 03:52:44.636763 systemd-networkd[2575]: cali9797fa4dab7: Gained IPv6LL May 27 03:52:44.956807 systemd-networkd[2575]: calibde38c8d43a: Gained IPv6LL May 27 03:52:45.426994 kubelet[4330]: E0527 03:52:45.426935 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:52:46.113841 containerd[2800]: time="2025-05-27T03:52:46.113789779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:46.114288 containerd[2800]: time="2025-05-27T03:52:46.113795579Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 03:52:46.114501 containerd[2800]: time="2025-05-27T03:52:46.114480860Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:46.116013 containerd[2800]: time="2025-05-27T03:52:46.115996581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:46.116607 containerd[2800]: time="2025-05-27T03:52:46.116590422Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 1.605884606s" May 27 03:52:46.116661 containerd[2800]: time="2025-05-27T03:52:46.116611662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 03:52:46.117357 containerd[2800]: time="2025-05-27T03:52:46.117334063Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 03:52:46.118073 containerd[2800]: time="2025-05-27T03:52:46.118054823Z" level=info msg="CreateContainer within sandbox \"8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 03:52:46.121341 containerd[2800]: time="2025-05-27T03:52:46.121320387Z" level=info msg="Container c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:46.124523 
containerd[2800]: time="2025-05-27T03:52:46.124500070Z" level=info msg="CreateContainer within sandbox \"8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0\"" May 27 03:52:46.124799 containerd[2800]: time="2025-05-27T03:52:46.124780830Z" level=info msg="StartContainer for \"c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0\"" May 27 03:52:46.125692 containerd[2800]: time="2025-05-27T03:52:46.125663431Z" level=info msg="connecting to shim c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0" address="unix:///run/containerd/s/a3000f8d7df56b50931654d140c352589c4318a9a9953b8585a29b05b85db74d" protocol=ttrpc version=3 May 27 03:52:46.152832 systemd[1]: Started cri-containerd-c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0.scope - libcontainer container c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0. May 27 03:52:46.180993 containerd[2800]: time="2025-05-27T03:52:46.180956526Z" level=info msg="StartContainer for \"c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0\" returns successfully" May 27 03:52:46.182110 containerd[2800]: time="2025-05-27T03:52:46.182085007Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:46.182141 containerd[2800]: time="2025-05-27T03:52:46.182127447Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 03:52:46.184302 containerd[2800]: time="2025-05-27T03:52:46.184272929Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 66.905866ms" May 27 03:52:46.184331 containerd[2800]: time="2025-05-27T03:52:46.184305409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 03:52:46.185722 containerd[2800]: time="2025-05-27T03:52:46.185699331Z" level=info msg="CreateContainer within sandbox \"5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 03:52:46.189092 containerd[2800]: time="2025-05-27T03:52:46.189063894Z" level=info msg="Container 877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:46.192319 containerd[2800]: time="2025-05-27T03:52:46.192294977Z" level=info msg="CreateContainer within sandbox \"5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42\"" May 27 03:52:46.192648 containerd[2800]: time="2025-05-27T03:52:46.192624577Z" level=info msg="StartContainer for \"877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42\"" May 27 03:52:46.193564 containerd[2800]: time="2025-05-27T03:52:46.193542058Z" level=info msg="connecting to shim 877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42" 
address="unix:///run/containerd/s/f9353b6c8213185555a9c1620e8b1601ef3623ae78da95665f9ff12783de3565" protocol=ttrpc version=3 May 27 03:52:46.223837 systemd[1]: Started cri-containerd-877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42.scope - libcontainer container 877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42. May 27 03:52:46.252001 containerd[2800]: time="2025-05-27T03:52:46.251971036Z" level=info msg="StartContainer for \"877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42\" returns successfully" May 27 03:52:46.428824 systemd-networkd[2575]: cali7a752744fd2: Gained IPv6LL May 27 03:52:46.437135 kubelet[4330]: I0527 03:52:46.437091 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-658b59d8b7-cq2pw" podStartSLOduration=19.868072025 podStartE2EDuration="21.43707502s" podCreationTimestamp="2025-05-27 03:52:25 +0000 UTC" firstStartedPulling="2025-05-27 03:52:44.615749655 +0000 UTC m=+36.348587875" lastFinishedPulling="2025-05-27 03:52:46.18475265 +0000 UTC m=+37.917590870" observedRunningTime="2025-05-27 03:52:46.4367911 +0000 UTC m=+38.169629360" watchObservedRunningTime="2025-05-27 03:52:46.43707502 +0000 UTC m=+38.169913240" May 27 03:52:46.444751 kubelet[4330]: I0527 03:52:46.444712 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-658b59d8b7-5zqdk" podStartSLOduration=19.83797782 podStartE2EDuration="21.444699587s" podCreationTimestamp="2025-05-27 03:52:25 +0000 UTC" firstStartedPulling="2025-05-27 03:52:44.510481816 +0000 UTC m=+36.243320036" lastFinishedPulling="2025-05-27 03:52:46.117203583 +0000 UTC m=+37.850041803" observedRunningTime="2025-05-27 03:52:46.444334987 +0000 UTC m=+38.177173207" watchObservedRunningTime="2025-05-27 03:52:46.444699587 +0000 UTC m=+38.177537807" May 27 03:52:46.556740 systemd-networkd[2575]: calia91ebfef851: Gained IPv6LL May 27 03:52:47.351574 containerd[2800]: time="2025-05-27T03:52:47.351537225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qm7vf,Uid:ff1e6702-11d8-474c-81da-442634bba8ed,Namespace:calico-system,Attempt:0,}" May 27 03:52:47.351897 containerd[2800]: time="2025-05-27T03:52:47.351538825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8jk2j,Uid:7b582b11-35a7-4cd7-a048-ee32ed51f7c8,Namespace:kube-system,Attempt:0,}" May 27 03:52:47.351897 containerd[2800]: time="2025-05-27T03:52:47.351540865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65f67dbd-fgvq5,Uid:3619f166-91f2-40b2-bad8-3ee4a2356c2c,Namespace:calico-system,Attempt:0,}" May 27 03:52:47.432144 kubelet[4330]: I0527 03:52:47.432110 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:47.432264 kubelet[4330]: I0527 03:52:47.432110 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:47.438361 systemd-networkd[2575]: cali335649a0333: Link UP May 27 03:52:47.438604 systemd-networkd[2575]: cali335649a0333: Gained carrier May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7457] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0 calico-kube-controllers-65f67dbd- calico-system 3619f166-91f2-40b2-bad8-3ee4a2356c2c 760 0 2025-05-27 03:52:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers 
k8s-app:calico-kube-controllers pod-template-hash:65f67dbd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 calico-kube-controllers-65f67dbd-fgvq5 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali335649a0333 [] [] }} ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7457] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.401 [INFO][7528] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" HandleID="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Workload="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.402 [INFO][7528] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" HandleID="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Workload="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dbb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-636136d453", "pod":"calico-kube-controllers-65f67dbd-fgvq5", "timestamp":"2025-05-27 03:52:47.401963432 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.402 [INFO][7528] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.402 [INFO][7528] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
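Stepping back to the kubelet pod_startup_latency_tracker entries earlier in this section: the logged numbers fit podStartSLOduration = podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling). A short sketch reproducing that arithmetic for calico-apiserver-658b59d8b7-cq2pw from the logged timestamps (illustration only):

package main

import (
	"fmt"
	"time"
)

// mustParse parses the timestamp format kubelet logs (monotonic "m=+..." suffix dropped).
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstPull := mustParse("2025-05-27 03:52:44.615749655 +0000 UTC")
	lastPull := mustParse("2025-05-27 03:52:46.18475265 +0000 UTC")
	pullWindow := lastPull.Sub(firstPull) // 1.569002995s

	e2e, err := time.ParseDuration("21.43707502s") // podStartE2EDuration from the log
	if err != nil {
		panic(err)
	}

	fmt.Println("image pull window: ", pullWindow)
	fmt.Println("podStartSLOduration:", e2e-pullWindow) // 19.868072025s, exactly as logged
}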
May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.402 [INFO][7528] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.416 [INFO][7528] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.420 [INFO][7528] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.424 [INFO][7528] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.425 [INFO][7528] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.427 [INFO][7528] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.427 [INFO][7528] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.428 [INFO][7528] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18 May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.431 [INFO][7528] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7528] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.6/26] block=192.168.105.0/26 handle="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7528] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.6/26] handle="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7528] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:52:47.446887 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7528] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.6/26] IPv6=[] ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" HandleID="k8s-pod-network.37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Workload="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.447318 containerd[2800]: 2025-05-27 03:52:47.437 [INFO][7457] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0", GenerateName:"calico-kube-controllers-65f67dbd-", Namespace:"calico-system", SelfLink:"", UID:"3619f166-91f2-40b2-bad8-3ee4a2356c2c", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65f67dbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"calico-kube-controllers-65f67dbd-fgvq5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali335649a0333", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.447318 containerd[2800]: 2025-05-27 03:52:47.437 [INFO][7457] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.6/32] ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.447318 containerd[2800]: 2025-05-27 03:52:47.437 [INFO][7457] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali335649a0333 ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.447318 containerd[2800]: 2025-05-27 03:52:47.438 [INFO][7457] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.447318 
containerd[2800]: 2025-05-27 03:52:47.439 [INFO][7457] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0", GenerateName:"calico-kube-controllers-65f67dbd-", Namespace:"calico-system", SelfLink:"", UID:"3619f166-91f2-40b2-bad8-3ee4a2356c2c", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65f67dbd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18", Pod:"calico-kube-controllers-65f67dbd-fgvq5", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.105.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali335649a0333", MAC:"4e:48:af:db:41:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.447318 containerd[2800]: 2025-05-27 03:52:47.445 [INFO][7457] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" Namespace="calico-system" Pod="calico-kube-controllers-65f67dbd-fgvq5" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-calico--kube--controllers--65f67dbd--fgvq5-eth0" May 27 03:52:47.457225 containerd[2800]: time="2025-05-27T03:52:47.457187163Z" level=info msg="connecting to shim 37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18" address="unix:///run/containerd/s/885f3534363e0eacacb64d0b2a6815730960279419deef934ced18847dd92c38" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:47.489900 systemd[1]: Started cri-containerd-37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18.scope - libcontainer container 37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18. 
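One error worth flagging in this section is the kubelet ImagePullBackOff for goldmane-78d55f7ddc-tt8gf above: the anonymous token fetch for ghcr.io/flatcar/calico/goldmane:v3.30.0 returned 403 Forbidden, so at the time of this log the image could not be pulled without credentials. The request can be reproduced from the URL in the error using only the standard library; this is a diagnostic sketch, not something the node itself runs.

package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Token endpoint copied verbatim from the kubelet error above.
	url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io"

	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// A 403 here matches the "failed to fetch anonymous token" error in the log.
	fmt.Println(resp.Status)
}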
May 27 03:52:47.516683 containerd[2800]: time="2025-05-27T03:52:47.516641099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65f67dbd-fgvq5,Uid:3619f166-91f2-40b2-bad8-3ee4a2356c2c,Namespace:calico-system,Attempt:0,} returns sandbox id \"37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18\"" May 27 03:52:47.517808 containerd[2800]: time="2025-05-27T03:52:47.517762100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 03:52:47.551715 systemd-networkd[2575]: cali843973c3fb7: Link UP May 27 03:52:47.552869 systemd-networkd[2575]: cali843973c3fb7: Gained carrier May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7452] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0 csi-node-driver- calico-system ff1e6702-11d8-474c-81da-442634bba8ed 682 0 2025-05-27 03:52:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 csi-node-driver-qm7vf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali843973c3fb7 [] [] }} ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7452] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.403 [INFO][7527] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" HandleID="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Workload="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.403 [INFO][7527] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" HandleID="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Workload="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004de20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-636136d453", "pod":"csi-node-driver-qm7vf", "timestamp":"2025-05-27 03:52:47.402994513 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.403 [INFO][7527] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7527] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.435 [INFO][7527] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.517 [INFO][7527] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.521 [INFO][7527] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.524 [INFO][7527] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.525 [INFO][7527] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.527 [INFO][7527] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.527 [INFO][7527] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.528 [INFO][7527] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755 May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.544 [INFO][7527] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.548 [INFO][7527] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.7/26] block=192.168.105.0/26 handle="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.548 [INFO][7527] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.7/26] handle="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.548 [INFO][7527] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:52:47.561512 containerd[2800]: 2025-05-27 03:52:47.548 [INFO][7527] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.7/26] IPv6=[] ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" HandleID="k8s-pod-network.1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Workload="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.550 [INFO][7452] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ff1e6702-11d8-474c-81da-442634bba8ed", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"csi-node-driver-qm7vf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali843973c3fb7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.550 [INFO][7452] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.7/32] ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.550 [INFO][7452] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali843973c3fb7 ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.553 [INFO][7452] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.553 [INFO][7452] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ff1e6702-11d8-474c-81da-442634bba8ed", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755", Pod:"csi-node-driver-qm7vf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.105.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali843973c3fb7", MAC:"26:9f:2b:8b:0e:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.561963 containerd[2800]: 2025-05-27 03:52:47.560 [INFO][7452] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" Namespace="calico-system" Pod="csi-node-driver-qm7vf" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-csi--node--driver--qm7vf-eth0" May 27 03:52:47.570878 containerd[2800]: time="2025-05-27T03:52:47.570850829Z" level=info msg="connecting to shim 1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755" address="unix:///run/containerd/s/f218c7c55304f548a098edaeaee673c7302c482cc2c3e903fff55e71ff8ddab3" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:47.604792 systemd[1]: Started cri-containerd-1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755.scope - libcontainer container 1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755. 
May 27 03:52:47.623145 containerd[2800]: time="2025-05-27T03:52:47.623113758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qm7vf,Uid:ff1e6702-11d8-474c-81da-442634bba8ed,Namespace:calico-system,Attempt:0,} returns sandbox id \"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755\"" May 27 03:52:47.640414 systemd-networkd[2575]: cali117e624515f: Link UP May 27 03:52:47.640747 systemd-networkd[2575]: cali117e624515f: Gained carrier May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7461] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0 coredns-668d6bf9bc- kube-system 7b582b11-35a7-4cd7-a048-ee32ed51f7c8 756 0 2025-05-27 03:52:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-636136d453 coredns-668d6bf9bc-8jk2j eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali117e624515f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.381 [INFO][7461] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.402 [INFO][7529] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" HandleID="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.403 [INFO][7529] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" HandleID="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003de1e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-636136d453", "pod":"coredns-668d6bf9bc-8jk2j", "timestamp":"2025-05-27 03:52:47.402947793 +0000 UTC"}, Hostname:"ci-4344.0.0-a-636136d453", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.403 [INFO][7529] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.548 [INFO][7529] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.549 [INFO][7529] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-636136d453' May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.618 [INFO][7529] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.622 [INFO][7529] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.625 [INFO][7529] ipam/ipam.go 511: Trying affinity for 192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.626 [INFO][7529] ipam/ipam.go 158: Attempting to load block cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.628 [INFO][7529] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.105.0/26 host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.628 [INFO][7529] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.105.0/26 handle="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.630 [INFO][7529] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.632 [INFO][7529] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.105.0/26 handle="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.637 [INFO][7529] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.105.8/26] block=192.168.105.0/26 handle="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.637 [INFO][7529] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.105.8/26] handle="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" host="ci-4344.0.0-a-636136d453" May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.637 [INFO][7529] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 03:52:47.647855 containerd[2800]: 2025-05-27 03:52:47.637 [INFO][7529] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.105.8/26] IPv6=[] ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" HandleID="k8s-pod-network.4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Workload="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.639 [INFO][7461] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7b582b11-35a7-4cd7-a048-ee32ed51f7c8", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"", Pod:"coredns-668d6bf9bc-8jk2j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali117e624515f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.639 [INFO][7461] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.105.8/32] ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.639 [INFO][7461] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali117e624515f ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.640 [INFO][7461] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.641 [INFO][7461] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7b582b11-35a7-4cd7-a048-ee32ed51f7c8", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 52, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-636136d453", ContainerID:"4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb", Pod:"coredns-668d6bf9bc-8jk2j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.105.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali117e624515f", MAC:"46:d2:9d:7c:a5:cd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:52:47.648318 containerd[2800]: 2025-05-27 03:52:47.646 [INFO][7461] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" Namespace="kube-system" Pod="coredns-668d6bf9bc-8jk2j" WorkloadEndpoint="ci--4344.0.0--a--636136d453-k8s-coredns--668d6bf9bc--8jk2j-eth0" May 27 03:52:47.657804 containerd[2800]: time="2025-05-27T03:52:47.657770990Z" level=info msg="connecting to shim 4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb" address="unix:///run/containerd/s/356239e54bde00cdd985e3f5f75e38a8b651333c538dda657699afb4a3b59083" namespace=k8s.io protocol=ttrpc version=3 May 27 03:52:47.687801 systemd[1]: Started cri-containerd-4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb.scope - libcontainer container 4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb. 
May 27 03:52:47.714164 containerd[2800]: time="2025-05-27T03:52:47.714135762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8jk2j,Uid:7b582b11-35a7-4cd7-a048-ee32ed51f7c8,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb\"" May 27 03:52:47.716032 containerd[2800]: time="2025-05-27T03:52:47.716007404Z" level=info msg="CreateContainer within sandbox \"4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 03:52:47.720278 containerd[2800]: time="2025-05-27T03:52:47.720254248Z" level=info msg="Container a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:47.722749 containerd[2800]: time="2025-05-27T03:52:47.722720570Z" level=info msg="CreateContainer within sandbox \"4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59\"" May 27 03:52:47.723058 containerd[2800]: time="2025-05-27T03:52:47.723041451Z" level=info msg="StartContainer for \"a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59\"" May 27 03:52:47.723788 containerd[2800]: time="2025-05-27T03:52:47.723768331Z" level=info msg="connecting to shim a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59" address="unix:///run/containerd/s/356239e54bde00cdd985e3f5f75e38a8b651333c538dda657699afb4a3b59083" protocol=ttrpc version=3 May 27 03:52:47.753814 systemd[1]: Started cri-containerd-a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59.scope - libcontainer container a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59. 
May 27 03:52:47.774733 containerd[2800]: time="2025-05-27T03:52:47.774709179Z" level=info msg="StartContainer for \"a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59\" returns successfully" May 27 03:52:48.444273 kubelet[4330]: I0527 03:52:48.443995 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-8jk2j" podStartSLOduration=34.443978895 podStartE2EDuration="34.443978895s" podCreationTimestamp="2025-05-27 03:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:52:48.443446295 +0000 UTC m=+40.176284515" watchObservedRunningTime="2025-05-27 03:52:48.443978895 +0000 UTC m=+40.176817115" May 27 03:52:48.609988 containerd[2800]: time="2025-05-27T03:52:48.609941960Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:48.610377 containerd[2800]: time="2025-05-27T03:52:48.609964240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 03:52:48.610652 containerd[2800]: time="2025-05-27T03:52:48.610633321Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:48.612119 containerd[2800]: time="2025-05-27T03:52:48.612099482Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:48.612744 containerd[2800]: time="2025-05-27T03:52:48.612716762Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 1.094928382s" May 27 03:52:48.612793 containerd[2800]: time="2025-05-27T03:52:48.612748602Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 03:52:48.613515 containerd[2800]: time="2025-05-27T03:52:48.613480803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 03:52:48.618010 containerd[2800]: time="2025-05-27T03:52:48.617963807Z" level=info msg="CreateContainer within sandbox \"37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 03:52:48.623450 containerd[2800]: time="2025-05-27T03:52:48.623387772Z" level=info msg="Container 3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:48.627637 containerd[2800]: time="2025-05-27T03:52:48.627610415Z" level=info msg="CreateContainer within sandbox \"37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\"" May 27 03:52:48.628028 containerd[2800]: time="2025-05-27T03:52:48.628007336Z" level=info 
msg="StartContainer for \"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\"" May 27 03:52:48.628967 containerd[2800]: time="2025-05-27T03:52:48.628942137Z" level=info msg="connecting to shim 3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3" address="unix:///run/containerd/s/885f3534363e0eacacb64d0b2a6815730960279419deef934ced18847dd92c38" protocol=ttrpc version=3 May 27 03:52:48.664854 systemd[1]: Started cri-containerd-3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3.scope - libcontainer container 3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3. May 27 03:52:48.693908 containerd[2800]: time="2025-05-27T03:52:48.693878753Z" level=info msg="StartContainer for \"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" returns successfully" May 27 03:52:48.732756 systemd-networkd[2575]: cali335649a0333: Gained IPv6LL May 27 03:52:48.860855 systemd-networkd[2575]: cali117e624515f: Gained IPv6LL May 27 03:52:48.988746 systemd-networkd[2575]: cali843973c3fb7: Gained IPv6LL May 27 03:52:49.040563 containerd[2800]: time="2025-05-27T03:52:49.040493533Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.040699 containerd[2800]: time="2025-05-27T03:52:49.040558853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 03:52:49.041096 containerd[2800]: time="2025-05-27T03:52:49.041078134Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.042641 containerd[2800]: time="2025-05-27T03:52:49.042622415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.043275 containerd[2800]: time="2025-05-27T03:52:49.043253655Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 429.749612ms" May 27 03:52:49.043321 containerd[2800]: time="2025-05-27T03:52:49.043280775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 03:52:49.044826 containerd[2800]: time="2025-05-27T03:52:49.044804657Z" level=info msg="CreateContainer within sandbox \"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 03:52:49.049742 containerd[2800]: time="2025-05-27T03:52:49.049715821Z" level=info msg="Container 8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:49.053730 containerd[2800]: time="2025-05-27T03:52:49.053696864Z" level=info msg="CreateContainer within sandbox \"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500\"" May 27 03:52:49.054054 containerd[2800]: 
time="2025-05-27T03:52:49.054034504Z" level=info msg="StartContainer for \"8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500\"" May 27 03:52:49.055294 containerd[2800]: time="2025-05-27T03:52:49.055274665Z" level=info msg="connecting to shim 8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500" address="unix:///run/containerd/s/f218c7c55304f548a098edaeaee673c7302c482cc2c3e903fff55e71ff8ddab3" protocol=ttrpc version=3 May 27 03:52:49.083782 systemd[1]: Started cri-containerd-8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500.scope - libcontainer container 8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500. May 27 03:52:49.110937 containerd[2800]: time="2025-05-27T03:52:49.110906191Z" level=info msg="StartContainer for \"8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500\" returns successfully" May 27 03:52:49.111697 containerd[2800]: time="2025-05-27T03:52:49.111681071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 03:52:49.451054 kubelet[4330]: I0527 03:52:49.450995 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-65f67dbd-fgvq5" podStartSLOduration=20.355167446 podStartE2EDuration="21.450979509s" podCreationTimestamp="2025-05-27 03:52:28 +0000 UTC" firstStartedPulling="2025-05-27 03:52:47.51754254 +0000 UTC m=+39.250380720" lastFinishedPulling="2025-05-27 03:52:48.613354563 +0000 UTC m=+40.346192783" observedRunningTime="2025-05-27 03:52:49.450408148 +0000 UTC m=+41.183246368" watchObservedRunningTime="2025-05-27 03:52:49.450979509 +0000 UTC m=+41.183817689" May 27 03:52:49.476284 containerd[2800]: time="2025-05-27T03:52:49.476239729Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.476395 containerd[2800]: time="2025-05-27T03:52:49.476251169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 03:52:49.476982 containerd[2800]: time="2025-05-27T03:52:49.476958850Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.478468 containerd[2800]: time="2025-05-27T03:52:49.478439331Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:52:49.479081 containerd[2800]: time="2025-05-27T03:52:49.479054492Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 367.345541ms" May 27 03:52:49.479109 containerd[2800]: time="2025-05-27T03:52:49.479085852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 03:52:49.479857 containerd[2800]: time="2025-05-27T03:52:49.479834412Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:52:49.480703 containerd[2800]: time="2025-05-27T03:52:49.480679333Z" level=info msg="CreateContainer within sandbox \"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 03:52:49.485028 containerd[2800]: time="2025-05-27T03:52:49.484998376Z" level=info msg="Container 817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84: CDI devices from CRI Config.CDIDevices: []" May 27 03:52:49.501397 containerd[2800]: time="2025-05-27T03:52:49.501361070Z" level=info msg="CreateContainer within sandbox \"1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84\"" May 27 03:52:49.501773 containerd[2800]: time="2025-05-27T03:52:49.501749110Z" level=info msg="StartContainer for \"817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84\"" May 27 03:52:49.503094 containerd[2800]: time="2025-05-27T03:52:49.503069991Z" level=info msg="connecting to shim 817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84" address="unix:///run/containerd/s/f218c7c55304f548a098edaeaee673c7302c482cc2c3e903fff55e71ff8ddab3" protocol=ttrpc version=3 May 27 03:52:49.503641 containerd[2800]: time="2025-05-27T03:52:49.503614952Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:49.503906 containerd[2800]: time="2025-05-27T03:52:49.503873592Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:49.503967 containerd[2800]: time="2025-05-27T03:52:49.503938152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:52:49.504030 kubelet[4330]: E0527 03:52:49.504000 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:52:49.504094 kubelet[4330]: E0527 03:52:49.504037 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:52:49.504158 kubelet[4330]: E0527 03:52:49.504126 4330 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:49.505816 containerd[2800]: time="2025-05-27T03:52:49.505789433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:52:49.533781 systemd[1]: Started cri-containerd-817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84.scope - libcontainer container 817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84. 
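[Editor's note, not part of the captured log] The repeated ErrImagePull entries above and below all trace back to the same step: containerd's anonymous token request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io is answered with 403 Forbidden, so the whisker, whisker-backend and (later) goldmane images never resolve and kubelet falls back to ImagePullBackOff. As a minimal sketch, assuming outbound HTTPS from the node and using the token URL copied verbatim from the "fetch failed" entries, the same request can be reproduced with Go's standard library to confirm whether the registry is refusing anonymous pulls for that repository:

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Token endpoint copied verbatim from the containerd "fetch failed" log entries above.
        url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io"
        resp, err := http.Get(url)
        if err != nil {
            fmt.Println("request failed:", err)
            return
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        // A 200 with a JSON token means anonymous pulls are allowed;
        // the log above shows 403 Forbidden at this same step instead.
        fmt.Println("status:", resp.Status)
        fmt.Println(string(body))
    }

A 403 from this request reproduces the failure independently of kubelet and containerd, which points at the registry refusing anonymous access to that repository rather than at the CNI or node configuration shown earlier in the log.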
May 27 03:52:49.534807 containerd[2800]: time="2025-05-27T03:52:49.534781177Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:49.535016 containerd[2800]: time="2025-05-27T03:52:49.534987297Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:49.535069 containerd[2800]: time="2025-05-27T03:52:49.535039457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:52:49.535138 kubelet[4330]: E0527 03:52:49.535108 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:52:49.535181 kubelet[4330]: E0527 03:52:49.535148 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:52:49.535278 kubelet[4330]: E0527 03:52:49.535246 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:49.536427 kubelet[4330]: E0527 03:52:49.536394 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:52:49.560832 containerd[2800]: time="2025-05-27T03:52:49.560803558Z" level=info msg="StartContainer for 
\"817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84\" returns successfully" May 27 03:52:50.401680 kubelet[4330]: I0527 03:52:50.401644 4330 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 03:52:50.401680 kubelet[4330]: I0527 03:52:50.401678 4330 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 03:52:50.445357 kubelet[4330]: I0527 03:52:50.445321 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:50.454082 kubelet[4330]: I0527 03:52:50.454037 4330 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-qm7vf" podStartSLOduration=20.598337132 podStartE2EDuration="22.454021585s" podCreationTimestamp="2025-05-27 03:52:28 +0000 UTC" firstStartedPulling="2025-05-27 03:52:47.623988679 +0000 UTC m=+39.356826899" lastFinishedPulling="2025-05-27 03:52:49.479673132 +0000 UTC m=+41.212511352" observedRunningTime="2025-05-27 03:52:50.453468785 +0000 UTC m=+42.186307005" watchObservedRunningTime="2025-05-27 03:52:50.454021585 +0000 UTC m=+42.186859805" May 27 03:52:55.942387 kubelet[4330]: I0527 03:52:55.942342 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:52:55.980292 containerd[2800]: time="2025-05-27T03:52:55.980259243Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"3ffc59a5ca7a1009097fb34d398b19aa6df1915394c602653dff57a7d0f46ec3\" pid:8004 exited_at:{seconds:1748317975 nanos:980052682}" May 27 03:52:56.017882 containerd[2800]: time="2025-05-27T03:52:56.017845583Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"f244fb1e28d576dfdb884bfbe383cf8b6a1eacfba6c710741498c428c05a8e4a\" pid:8027 exited_at:{seconds:1748317976 nanos:17675623}" May 27 03:52:57.352254 containerd[2800]: time="2025-05-27T03:52:57.352209746Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:52:57.374950 containerd[2800]: time="2025-05-27T03:52:57.374913837Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:52:57.375178 containerd[2800]: time="2025-05-27T03:52:57.375141677Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:52:57.375247 containerd[2800]: time="2025-05-27T03:52:57.375184317Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:52:57.375365 kubelet[4330]: E0527 03:52:57.375320 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:52:57.375563 kubelet[4330]: E0527 03:52:57.375373 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:52:57.375563 kubelet[4330]: E0527 03:52:57.375488 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:52:57.376663 kubelet[4330]: E0527 03:52:57.376635 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:53:02.352323 kubelet[4330]: E0527 03:53:02.352272 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:53:08.351960 kubelet[4330]: E0527 03:53:08.351852 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:53:12.982769 containerd[2800]: time="2025-05-27T03:53:12.982721915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"929e68e6311c2d1779f5a53c1187683be1d51e0a1ef30bc25d3f302fe7485ee5\" pid:8070 exited_at:{seconds:1748317992 nanos:982453235}" May 27 03:53:13.351888 containerd[2800]: time="2025-05-27T03:53:13.351858460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:53:13.370948 containerd[2800]: time="2025-05-27T03:53:13.370902583Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:53:13.371195 containerd[2800]: time="2025-05-27T03:53:13.371165103Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:53:13.371245 containerd[2800]: time="2025-05-27T03:53:13.371217943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:53:13.371349 kubelet[4330]: E0527 03:53:13.371304 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:53:13.371580 kubelet[4330]: E0527 03:53:13.371356 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:53:13.371580 kubelet[4330]: E0527 03:53:13.371441 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:53:13.373067 containerd[2800]: time="2025-05-27T03:53:13.373049143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:53:13.397354 containerd[2800]: time="2025-05-27T03:53:13.397311348Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:53:13.397582 containerd[2800]: time="2025-05-27T03:53:13.397545268Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:53:13.397657 containerd[2800]: time="2025-05-27T03:53:13.397595748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:53:13.397701 kubelet[4330]: E0527 03:53:13.397658 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:53:13.397701 kubelet[4330]: E0527 03:53:13.397693 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:53:13.397802 kubelet[4330]: E0527 03:53:13.397766 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:53:13.398947 kubelet[4330]: E0527 03:53:13.398912 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:53:13.616098 kubelet[4330]: I0527 03:53:13.616025 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:53:17.308678 containerd[2800]: time="2025-05-27T03:53:17.308379456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"a8abf98915071dd09db86c73f36000efb20d7c1c7c2d79a3580320d6cb5bc502\" pid:8112 exited_at:{seconds:1748317997 nanos:307935651}" May 27 03:53:22.351540 containerd[2800]: time="2025-05-27T03:53:22.351476290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:53:22.375676 containerd[2800]: time="2025-05-27T03:53:22.374211108Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:53:22.376262 containerd[2800]: time="2025-05-27T03:53:22.376094206Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:53:22.376262 containerd[2800]: time="2025-05-27T03:53:22.376162207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:53:22.376798 kubelet[4330]: E0527 03:53:22.376749 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:53:22.377719 kubelet[4330]: E0527 03:53:22.376797 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:53:22.377719 kubelet[4330]: 
E0527 03:53:22.376904 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:53:22.379684 kubelet[4330]: E0527 03:53:22.378289 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to 
fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:53:26.014449 containerd[2800]: time="2025-05-27T03:53:26.014398596Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"d55b7d3de97fdc32e61921193e39d23da2aea5e5374856164aed32fe2812b91d\" pid:8144 exited_at:{seconds:1748318006 nanos:14218794}" May 27 03:53:28.352216 kubelet[4330]: E0527 03:53:28.352155 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:53:28.639774 kubelet[4330]: I0527 03:53:28.639647 4330 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:53:32.155512 systemd[1]: Started sshd@7-147.28.163.138:22-165.227.38.67:58328.service - OpenSSH per-connection server daemon (165.227.38.67:58328). May 27 03:53:32.174505 sshd[8160]: banner exchange: Connection from 165.227.38.67 port 58328: invalid format May 27 03:53:32.175469 systemd[1]: sshd@7-147.28.163.138:22-165.227.38.67:58328.service: Deactivated successfully. May 27 03:53:32.236183 systemd[1]: Started sshd@8-147.28.163.138:22-165.227.38.67:58338.service - OpenSSH per-connection server daemon (165.227.38.67:58338). May 27 03:53:32.253750 sshd[8164]: banner exchange: Connection from 165.227.38.67 port 58338: invalid format May 27 03:53:32.254682 systemd[1]: sshd@8-147.28.163.138:22-165.227.38.67:58338.service: Deactivated successfully. 
May 27 03:53:33.352069 kubelet[4330]: E0527 03:53:33.352023 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:53:42.352394 kubelet[4330]: E0527 03:53:42.352334 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:53:42.959410 containerd[2800]: time="2025-05-27T03:53:42.959371510Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"38f2deda2329f950431797ac145504278491e5135c8a624b3cdce3a78ee29bad\" pid:8185 exited_at:{seconds:1748318022 nanos:959185629}" May 27 03:53:46.352171 kubelet[4330]: E0527 03:53:46.352114 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:53:55.352226 containerd[2800]: time="2025-05-27T03:53:55.352152600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:53:55.478411 containerd[2800]: time="2025-05-27T03:53:55.478372164Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" 
host=ghcr.io May 27 03:53:55.478641 containerd[2800]: time="2025-05-27T03:53:55.478615365Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:53:55.478696 containerd[2800]: time="2025-05-27T03:53:55.478662125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:53:55.478796 kubelet[4330]: E0527 03:53:55.478759 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:53:55.479051 kubelet[4330]: E0527 03:53:55.478800 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:53:55.479051 kubelet[4330]: E0527 03:53:55.478886 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:53:55.480506 containerd[2800]: time="2025-05-27T03:53:55.480487813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:53:55.502962 containerd[2800]: time="2025-05-27T03:53:55.502921786Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:53:55.503200 containerd[2800]: time="2025-05-27T03:53:55.503171307Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:53:55.503255 containerd[2800]: time="2025-05-27T03:53:55.503173987Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:53:55.503322 kubelet[4330]: E0527 03:53:55.503293 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:53:55.503374 kubelet[4330]: E0527 03:53:55.503328 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:53:55.503441 kubelet[4330]: E0527 03:53:55.503408 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:53:55.504592 kubelet[4330]: E0527 03:53:55.504559 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:53:56.023426 containerd[2800]: time="2025-05-27T03:53:56.023393984Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"25ed7253d5d19a559d584a346c8d7b1fb62753ba299d575245e444f1c31d7688\" pid:8223 exited_at:{seconds:1748318036 nanos:23214943}" May 27 03:53:58.351987 kubelet[4330]: E0527 03:53:58.351941 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:54:01.102467 systemd[1]: Started sshd@9-147.28.163.138:22-194.0.234.20:65105.service - OpenSSH per-connection server daemon (194.0.234.20:65105). May 27 03:54:01.225328 sshd[8241]: Connection closed by 194.0.234.20 port 65105 May 27 03:54:01.226471 systemd[1]: sshd@9-147.28.163.138:22-194.0.234.20:65105.service: Deactivated successfully. May 27 03:54:06.352045 kubelet[4330]: E0527 03:54:06.351989 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:54:12.963304 containerd[2800]: time="2025-05-27T03:54:12.963267943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"70c409ca10c6f9fea5035c309c1b29de5863242d0f01123e4a941799e397c7ec\" pid:8287 exited_at:{seconds:1748318052 nanos:963076423}" May 27 03:54:13.351700 containerd[2800]: time="2025-05-27T03:54:13.351652822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:54:13.373457 containerd[2800]: time="2025-05-27T03:54:13.373393805Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:54:13.373683 containerd[2800]: time="2025-05-27T03:54:13.373652406Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" 
error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:54:13.374063 containerd[2800]: time="2025-05-27T03:54:13.373678686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:54:13.374105 kubelet[4330]: E0527 03:54:13.373820 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:54:13.374105 kubelet[4330]: E0527 03:54:13.373865 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:54:13.374105 kubelet[4330]: E0527 03:54:13.373970 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:54:13.375130 kubelet[4330]: E0527 03:54:13.375104 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:54:17.313810 containerd[2800]: time="2025-05-27T03:54:17.313773372Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"13b4a0d46c8a8c8eb2aa8b662d93f4d54adeea781957dac2299be3465e35f60f\" pid:8324 exited_at:{seconds:1748318057 nanos:313606492}" May 27 03:54:20.352110 kubelet[4330]: E0527 03:54:20.352061 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" 
podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:54:26.021692 containerd[2800]: time="2025-05-27T03:54:26.021647608Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"64ae2d36afdc2e77689f4e5e090b72115d49b0d8616f61d7dcf927f5e04e2f38\" pid:8353 exited_at:{seconds:1748318066 nanos:21439567}" May 27 03:54:26.351981 kubelet[4330]: E0527 03:54:26.351878 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:54:31.351489 kubelet[4330]: E0527 03:54:31.351424 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:54:41.351255 kubelet[4330]: E0527 03:54:41.351157 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:54:42.961036 containerd[2800]: time="2025-05-27T03:54:42.960995636Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"ebef7545019eafa0e40e63fd5072d3a2fd86bf0e592cabb7fbda141067a21aed\" pid:8399 exited_at:{seconds:1748318082 nanos:960763676}" May 27 03:54:46.354446 kubelet[4330]: E0527 03:54:46.354397 4330 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:54:54.351958 kubelet[4330]: E0527 03:54:54.351897 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:54:56.020487 containerd[2800]: time="2025-05-27T03:54:56.020452382Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"336af5903c3aecbc37320ffebe1eb4388d3d22897589abd38647d211806c0d44\" pid:8449 exited_at:{seconds:1748318096 nanos:20273102}" May 27 03:54:59.352313 kubelet[4330]: E0527 03:54:59.352258 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 
03:55:06.352063 kubelet[4330]: E0527 03:55:06.352012 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:55:12.961966 containerd[2800]: time="2025-05-27T03:55:12.961925034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"346daa3ac462f3030f567ba34dbbc5e49e42415798a8a95f2b3a5d220e5dd634\" pid:8477 exited_at:{seconds:1748318112 nanos:961663434}" May 27 03:55:14.352484 kubelet[4330]: E0527 03:55:14.352438 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:55:17.303788 containerd[2800]: time="2025-05-27T03:55:17.303750558Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"d404d676a2259d3865c1832f289c83987093bcfdbe1335ae0d39b305ae6bc219\" pid:8513 exited_at:{seconds:1748318117 nanos:303561278}" May 27 03:55:17.351946 kubelet[4330]: E0527 03:55:17.351896 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:55:26.020440 containerd[2800]: time="2025-05-27T03:55:26.020393342Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"1d093985e7650240903fa561d3e34422d18cdd2c76e279e6226e324a3f9c67f9\" pid:8540 exited_at:{seconds:1748318126 nanos:20243341}" May 27 03:55:26.351856 containerd[2800]: time="2025-05-27T03:55:26.351778765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:55:26.436803 containerd[2800]: time="2025-05-27T03:55:26.436760291Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:55:26.437039 containerd[2800]: time="2025-05-27T03:55:26.437007412Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:55:26.437078 containerd[2800]: time="2025-05-27T03:55:26.437059212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:55:26.437171 kubelet[4330]: E0527 03:55:26.437138 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:55:26.437406 kubelet[4330]: E0527 03:55:26.437180 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:55:26.437406 kubelet[4330]: E0527 03:55:26.437268 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:55:26.438860 containerd[2800]: time="2025-05-27T03:55:26.438842261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:55:26.465307 containerd[2800]: time="2025-05-27T03:55:26.465270747Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:55:26.465557 containerd[2800]: time="2025-05-27T03:55:26.465527268Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:55:26.465601 containerd[2800]: time="2025-05-27T03:55:26.465582229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:55:26.465723 kubelet[4330]: E0527 03:55:26.465687 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:55:26.465766 kubelet[4330]: E0527 03:55:26.465729 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:55:26.465898 kubelet[4330]: E0527 03:55:26.465843 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:55:26.467039 kubelet[4330]: E0527 03:55:26.467007 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:55:28.352573 kubelet[4330]: E0527 03:55:28.352527 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:55:40.352544 kubelet[4330]: E0527 03:55:40.352491 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:55:42.351503 containerd[2800]: time="2025-05-27T03:55:42.351438726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:55:42.374757 containerd[2800]: time="2025-05-27T03:55:42.374658621Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:55:42.374933 containerd[2800]: time="2025-05-27T03:55:42.374905022Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:55:42.375003 containerd[2800]: time="2025-05-27T03:55:42.374954262Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:55:42.375092 kubelet[4330]: E0527 03:55:42.375051 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:55:42.375323 kubelet[4330]: E0527 03:55:42.375097 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:55:42.375323 kubelet[4330]: E0527 03:55:42.375202 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:55:42.376378 kubelet[4330]: E0527 03:55:42.376351 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:55:42.961614 containerd[2800]: time="2025-05-27T03:55:42.961579012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"841a1e1f0b9b69f08d316d5322aff5551a6370d29eb8fdee9fc0fd00b8fc41f8\" pid:8570 exited_at:{seconds:1748318142 nanos:961289731}" May 27 03:55:55.351365 kubelet[4330]: E0527 03:55:55.351299 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:55:55.351800 kubelet[4330]: E0527 03:55:55.351633 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:55:56.020486 containerd[2800]: time="2025-05-27T03:55:56.020450015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"c6cd076192c96c1d79a0b2ce924d9359fdac522a3ccd9223516be12a9361f43f\" pid:8634 exited_at:{seconds:1748318156 nanos:20257054}" May 27 03:56:06.352058 kubelet[4330]: E0527 03:56:06.351971 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:56:09.351653 kubelet[4330]: E0527 03:56:09.351610 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:56:12.961169 containerd[2800]: time="2025-05-27T03:56:12.961137252Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"8674343560b8a889c7a3ec41a1f1ae3d92e19bf276ce56cf9e558dc76cf2b72e\" pid:8658 exited_at:{seconds:1748318172 nanos:960866012}" May 27 03:56:17.311354 containerd[2800]: time="2025-05-27T03:56:17.311318051Z" level=info msg="TaskExit 
event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"58bfaaab7159ba585d62239cf550230f7dcb96d67a3d5e75b2a9f222ac3e50d4\" pid:8695 exited_at:{seconds:1748318177 nanos:311143371}" May 27 03:56:17.351789 kubelet[4330]: E0527 03:56:17.351745 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:56:21.351275 kubelet[4330]: E0527 03:56:21.351234 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:56:26.020549 containerd[2800]: time="2025-05-27T03:56:26.020501161Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"938b4de4d1c8a822a2e0d4f5e0d204190b7866d2f284d36adab5c734a4aec4b2\" pid:8717 exited_at:{seconds:1748318186 nanos:20331641}" May 27 03:56:29.351513 kubelet[4330]: E0527 03:56:29.351446 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:56:33.351657 kubelet[4330]: E0527 03:56:33.351603 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:56:42.951122 containerd[2800]: time="2025-05-27T03:56:42.951069641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"ce452414c6c01d1225e1aeb38781a29757cd2a251d6017fc464cf4c2f36fe359\" pid:8753 exited_at:{seconds:1748318202 nanos:950816240}" May 27 03:56:43.352057 kubelet[4330]: E0527 03:56:43.352006 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:56:45.351762 kubelet[4330]: E0527 03:56:45.351726 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:56:54.352493 kubelet[4330]: E0527 03:56:54.352445 4330 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:56:56.019620 containerd[2800]: time="2025-05-27T03:56:56.019576910Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"434230db3530ad1af13e72b5f806e8654a98a80164fc5be7cb843de7ac610671\" pid:8805 exited_at:{seconds:1748318216 nanos:19386070}" May 27 03:56:58.352253 kubelet[4330]: E0527 03:56:58.352216 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:57:04.463252 containerd[2800]: time="2025-05-27T03:57:04.463182984Z" level=warning msg="container event discarded" container=04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6 type=CONTAINER_CREATED_EVENT May 27 03:57:04.475470 containerd[2800]: time="2025-05-27T03:57:04.475397492Z" level=warning msg="container event discarded" container=04a7676f4bd5214b7838e823b18694233196ad5e1b2e24ed80d01e123ffe0bf6 type=CONTAINER_STARTED_EVENT May 27 03:57:04.475470 containerd[2800]: time="2025-05-27T03:57:04.475428492Z" level=warning msg="container event discarded" container=22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095 type=CONTAINER_CREATED_EVENT May 27 03:57:04.475470 containerd[2800]: time="2025-05-27T03:57:04.475436652Z" level=warning msg="container event discarded" container=22024c10b6713d811eafc1e87913906e222c763078d06b9b01bdd6999b99d095 type=CONTAINER_STARTED_EVENT May 27 03:57:04.475470 containerd[2800]: time="2025-05-27T03:57:04.475443332Z" level=warning msg="container event discarded" container=e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0 type=CONTAINER_CREATED_EVENT May 27 03:57:04.475470 containerd[2800]: time="2025-05-27T03:57:04.475449932Z" level=warning msg="container event discarded" 
container=795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa type=CONTAINER_CREATED_EVENT May 27 03:57:04.486681 containerd[2800]: time="2025-05-27T03:57:04.486612278Z" level=warning msg="container event discarded" container=e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486 type=CONTAINER_CREATED_EVENT May 27 03:57:04.486681 containerd[2800]: time="2025-05-27T03:57:04.486641878Z" level=warning msg="container event discarded" container=e7558eecff9cb8903c40f0eded02a30a250165276107929e7cc723f8b498b486 type=CONTAINER_STARTED_EVENT May 27 03:57:04.486681 containerd[2800]: time="2025-05-27T03:57:04.486649598Z" level=warning msg="container event discarded" container=5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606 type=CONTAINER_CREATED_EVENT May 27 03:57:04.539980 containerd[2800]: time="2025-05-27T03:57:04.539919480Z" level=warning msg="container event discarded" container=795f5d076843f6ac156ab2bf60ccc364fb5b67dccf84a8a9a52e906803fb5dfa type=CONTAINER_STARTED_EVENT May 27 03:57:04.539980 containerd[2800]: time="2025-05-27T03:57:04.539951001Z" level=warning msg="container event discarded" container=e0b8bebc6f119c4eb2fb0e4cacbf29f8dd6e031f5d6cc88991b240f7b8525fd0 type=CONTAINER_STARTED_EVENT May 27 03:57:04.539980 containerd[2800]: time="2025-05-27T03:57:04.539958881Z" level=warning msg="container event discarded" container=5b667dee4455a576cab5668a0a7720106f2e57774805c8d3aea88f0c9dfc0606 type=CONTAINER_STARTED_EVENT May 27 03:57:06.352354 kubelet[4330]: E0527 03:57:06.352306 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:57:09.352354 kubelet[4330]: E0527 03:57:09.352312 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:57:12.955299 containerd[2800]: 
time="2025-05-27T03:57:12.955263730Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"3ba1664bc43f05bc1d2c8f880547737c7448c96eda56d3dc3c853d033802967c\" pid:8830 exited_at:{seconds:1748318232 nanos:955008169}" May 27 03:57:15.275589 containerd[2800]: time="2025-05-27T03:57:15.275546359Z" level=warning msg="container event discarded" container=85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f type=CONTAINER_CREATED_EVENT May 27 03:57:15.275589 containerd[2800]: time="2025-05-27T03:57:15.275587599Z" level=warning msg="container event discarded" container=85dec5b53d418ece8ad40adde34e7fa964858ac9fbc69f29fb716033c79e779f type=CONTAINER_STARTED_EVENT May 27 03:57:15.286783 containerd[2800]: time="2025-05-27T03:57:15.286756144Z" level=warning msg="container event discarded" container=9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb type=CONTAINER_CREATED_EVENT May 27 03:57:15.323960 containerd[2800]: time="2025-05-27T03:57:15.323936465Z" level=warning msg="container event discarded" container=9d955096d6ad96d8d8139ab0f1741098f552ad432db9b1dc73ed523afb6bfabb type=CONTAINER_STARTED_EVENT May 27 03:57:15.376245 containerd[2800]: time="2025-05-27T03:57:15.376215100Z" level=warning msg="container event discarded" container=0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450 type=CONTAINER_CREATED_EVENT May 27 03:57:15.376245 containerd[2800]: time="2025-05-27T03:57:15.376233980Z" level=warning msg="container event discarded" container=0bc5257561cfb6b4c6204291f74129ffb61dce61a5ad60c202ec4189c0a8e450 type=CONTAINER_STARTED_EVENT May 27 03:57:17.298647 containerd[2800]: time="2025-05-27T03:57:17.298610096Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"017f560ecdb3b107897b3fb295929e95b5a3d2f998c9769b67c89112193344a1\" pid:8867 exited_at:{seconds:1748318237 nanos:298438215}" May 27 03:57:19.806085 containerd[2800]: time="2025-05-27T03:57:19.806028801Z" level=warning msg="container event discarded" container=438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56 type=CONTAINER_CREATED_EVENT May 27 03:57:19.854435 containerd[2800]: time="2025-05-27T03:57:19.854384665Z" level=warning msg="container event discarded" container=438b08967fe842f1a6be74c38ea11586060a5e13125ff842cc2861e304d29c56 type=CONTAINER_STARTED_EVENT May 27 03:57:20.351535 kubelet[4330]: E0527 03:57:20.351483 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:57:20.351938 kubelet[4330]: E0527 03:57:20.351796 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:57:26.020373 containerd[2800]: time="2025-05-27T03:57:26.020331743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"183fc44d8b6ee87f46f4290f0f586bd9a7124fecc4c599e1fc1393eace074c03\" pid:8913 exited_at:{seconds:1748318246 nanos:20161903}" May 27 03:57:28.297186 containerd[2800]: time="2025-05-27T03:57:28.297145712Z" level=warning msg="container event discarded" container=fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029 type=CONTAINER_CREATED_EVENT May 27 03:57:28.297186 containerd[2800]: time="2025-05-27T03:57:28.297172073Z" level=warning msg="container event discarded" container=fe2bbdca4e3ff70882ba0b9402dd9f7fc749de191b765d033e42e188b60d5029 type=CONTAINER_STARTED_EVENT May 27 03:57:28.525591 containerd[2800]: time="2025-05-27T03:57:28.525546509Z" level=warning msg="container event discarded" container=0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0 type=CONTAINER_CREATED_EVENT May 27 03:57:28.525591 containerd[2800]: time="2025-05-27T03:57:28.525578709Z" level=warning msg="container event discarded" container=0b3f7b883cbd8c49872e602b02be05df5870f50996feab21d2f8bd8abcae82a0 type=CONTAINER_STARTED_EVENT May 27 03:57:29.229468 containerd[2800]: time="2025-05-27T03:57:29.229406695Z" level=warning msg="container event discarded" container=65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b type=CONTAINER_CREATED_EVENT May 27 03:57:29.292264 containerd[2800]: time="2025-05-27T03:57:29.292230185Z" level=warning msg="container event discarded" container=65949ad62cd007208c8d4010fa9c27fabf07c09d579625821b38561f7e7d5d5b type=CONTAINER_STARTED_EVENT May 27 03:57:29.579875 containerd[2800]: time="2025-05-27T03:57:29.579833903Z" level=warning msg="container event discarded" container=33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0 type=CONTAINER_CREATED_EVENT May 27 03:57:29.638064 containerd[2800]: time="2025-05-27T03:57:29.638027904Z" level=warning msg="container event discarded" container=33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0 type=CONTAINER_STARTED_EVENT May 27 03:57:29.962123 containerd[2800]: time="2025-05-27T03:57:29.962020897Z" level=warning msg="container event discarded" container=33394a804573512ddbcf73007c9a99d137b64c8cb18ae044dd92ab1a662470b0 type=CONTAINER_STOPPED_EVENT May 27 03:57:31.352043 kubelet[4330]: E0527 03:57:31.352009 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:57:31.598490 containerd[2800]: time="2025-05-27T03:57:31.598448722Z" level=warning msg="container event discarded" container=434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b type=CONTAINER_CREATED_EVENT May 27 03:57:31.655754 containerd[2800]: time="2025-05-27T03:57:31.655651680Z" level=warning msg="container event discarded" container=434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b type=CONTAINER_STARTED_EVENT May 27 03:57:32.177716 containerd[2800]: time="2025-05-27T03:57:32.177683596Z" level=warning msg="container event discarded" container=434cb8888ed74aa34335e3baa16c7f192f59eaaf68343bcd550f2a76a80df34b type=CONTAINER_STOPPED_EVENT May 27 03:57:32.352182 kubelet[4330]: E0527 03:57:32.352132 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:57:34.963939 containerd[2800]: time="2025-05-27T03:57:34.963854707Z" level=warning msg="container event discarded" container=1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1 type=CONTAINER_CREATED_EVENT May 27 03:57:35.026177 containerd[2800]: time="2025-05-27T03:57:35.026139594Z" level=warning msg="container event discarded" container=1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1 type=CONTAINER_STARTED_EVENT May 27 03:57:35.960664 containerd[2800]: time="2025-05-27T03:57:35.960610377Z" level=warning msg="container event discarded" container=098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002 type=CONTAINER_CREATED_EVENT May 27 03:57:35.960664 containerd[2800]: time="2025-05-27T03:57:35.960647137Z" level=warning msg="container event discarded" container=098b631360e74caac3332d63aad885b68a038b481a3681d7b93d324893ba3002 type=CONTAINER_STARTED_EVENT May 27 03:57:42.960868 containerd[2800]: time="2025-05-27T03:57:42.960822102Z" level=info msg="TaskExit event in 
podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"287791a9a962b04fd33c6008b58c834fb28215c8761f1c7469eb7857c6dfb82b\" pid:8934 exited_at:{seconds:1748318262 nanos:960502861}" May 27 03:57:43.531194 containerd[2800]: time="2025-05-27T03:57:43.531132116Z" level=warning msg="container event discarded" container=a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2 type=CONTAINER_CREATED_EVENT May 27 03:57:43.531386 containerd[2800]: time="2025-05-27T03:57:43.531356676Z" level=warning msg="container event discarded" container=a1decadc594f15907c300103d0369399152a4bdd9b85d0b6b11539a0a8aaacf2 type=CONTAINER_STARTED_EVENT May 27 03:57:43.635461 containerd[2800]: time="2025-05-27T03:57:43.635377443Z" level=warning msg="container event discarded" container=7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d type=CONTAINER_CREATED_EVENT May 27 03:57:43.635461 containerd[2800]: time="2025-05-27T03:57:43.635413523Z" level=warning msg="container event discarded" container=7e2f92492edd609aa488b3d2b24401981816e3d2b80ff1a15c755e41aa07590d type=CONTAINER_STARTED_EVENT May 27 03:57:43.635461 containerd[2800]: time="2025-05-27T03:57:43.635427643Z" level=warning msg="container event discarded" container=ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0 type=CONTAINER_CREATED_EVENT May 27 03:57:43.712798 containerd[2800]: time="2025-05-27T03:57:43.712762117Z" level=warning msg="container event discarded" container=ebe6b95635aeb59e67e53241f0282c4ea124d5682ff8652ed8dd05ad2d3523b0 type=CONTAINER_STARTED_EVENT May 27 03:57:44.520125 containerd[2800]: time="2025-05-27T03:57:44.520074799Z" level=warning msg="container event discarded" container=8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90 type=CONTAINER_CREATED_EVENT May 27 03:57:44.520125 containerd[2800]: time="2025-05-27T03:57:44.520104879Z" level=warning msg="container event discarded" container=8eff9d54c93d33be9b61c671477aaeef3377f7dae08e93db8e1216000d19ce90 type=CONTAINER_STARTED_EVENT May 27 03:57:44.625276 containerd[2800]: time="2025-05-27T03:57:44.625223047Z" level=warning msg="container event discarded" container=5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860 type=CONTAINER_CREATED_EVENT May 27 03:57:44.625276 containerd[2800]: time="2025-05-27T03:57:44.625256527Z" level=warning msg="container event discarded" container=5ea67bb47467b454a848471e0bc17146ac49702cdc52d6cc090080fd3c08d860 type=CONTAINER_STARTED_EVENT May 27 03:57:45.351333 kubelet[4330]: E0527 03:57:45.351301 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:57:46.134292 containerd[2800]: time="2025-05-27T03:57:46.134222311Z" level=warning msg="container event discarded" container=c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0 type=CONTAINER_CREATED_EVENT May 27 03:57:46.190481 containerd[2800]: 
time="2025-05-27T03:57:46.190437862Z" level=warning msg="container event discarded" container=c3d684e08b046a2a20cc4a6dd9c30fa25a3387eaba7a885613ae9408e3c5fff0 type=CONTAINER_STARTED_EVENT May 27 03:57:46.201607 containerd[2800]: time="2025-05-27T03:57:46.201561044Z" level=warning msg="container event discarded" container=877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42 type=CONTAINER_CREATED_EVENT May 27 03:57:46.261796 containerd[2800]: time="2025-05-27T03:57:46.261760243Z" level=warning msg="container event discarded" container=877b5fdd4eb1e40866ba09b47b0e134c85e565ba845360ee715ccdc090560c42 type=CONTAINER_STARTED_EVENT May 27 03:57:47.352475 kubelet[4330]: E0527 03:57:47.352431 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:57:47.526819 containerd[2800]: time="2025-05-27T03:57:47.526775333Z" level=warning msg="container event discarded" container=37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18 type=CONTAINER_CREATED_EVENT May 27 03:57:47.526819 containerd[2800]: time="2025-05-27T03:57:47.526808133Z" level=warning msg="container event discarded" container=37167df59457db3a4e4e872c74e374d42dbd66dcd73702c328644aa715f5fb18 type=CONTAINER_STARTED_EVENT May 27 03:57:47.634122 containerd[2800]: time="2025-05-27T03:57:47.634041864Z" level=warning msg="container event discarded" container=1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755 type=CONTAINER_CREATED_EVENT May 27 03:57:47.634122 containerd[2800]: time="2025-05-27T03:57:47.634067824Z" level=warning msg="container event discarded" container=1be40f3d5818090e9ca2fe3c7d3951d12ab1c28504594c34be381b7c88d7e755 type=CONTAINER_STARTED_EVENT May 27 03:57:47.724235 containerd[2800]: time="2025-05-27T03:57:47.724211681Z" level=warning msg="container event discarded" container=4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb type=CONTAINER_CREATED_EVENT May 27 03:57:47.724235 containerd[2800]: time="2025-05-27T03:57:47.724230281Z" level=warning msg="container event discarded" container=4d70935cc772002ecac3bda1864d458e34aa80371f9ae7e932387313c4a32fdb type=CONTAINER_STARTED_EVENT May 27 03:57:47.724298 containerd[2800]: time="2025-05-27T03:57:47.724242241Z" level=warning msg="container event discarded" container=a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59 type=CONTAINER_CREATED_EVENT May 27 03:57:47.784532 
containerd[2800]: time="2025-05-27T03:57:47.784503720Z" level=warning msg="container event discarded" container=a3bb0b6c9338e39bb911e3f035db675746f4c72f3916682a29e6ea6a294c3b59 type=CONTAINER_STARTED_EVENT May 27 03:57:48.637902 containerd[2800]: time="2025-05-27T03:57:48.637862954Z" level=warning msg="container event discarded" container=3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3 type=CONTAINER_CREATED_EVENT May 27 03:57:48.704094 containerd[2800]: time="2025-05-27T03:57:48.704063604Z" level=warning msg="container event discarded" container=3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3 type=CONTAINER_STARTED_EVENT May 27 03:57:49.063129 containerd[2800]: time="2025-05-27T03:57:49.063040187Z" level=warning msg="container event discarded" container=8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500 type=CONTAINER_CREATED_EVENT May 27 03:57:49.120280 containerd[2800]: time="2025-05-27T03:57:49.120264419Z" level=warning msg="container event discarded" container=8355727e6ad9c40b73041f4a56381d8846ca38437512dc0a276ec33661752500 type=CONTAINER_STARTED_EVENT May 27 03:57:49.510866 containerd[2800]: time="2025-05-27T03:57:49.510836302Z" level=warning msg="container event discarded" container=817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84 type=CONTAINER_CREATED_EVENT May 27 03:57:49.570057 containerd[2800]: time="2025-05-27T03:57:49.570027258Z" level=warning msg="container event discarded" container=817886b3044cfdf46ca53c3de0fad6c93b2e2e1a84c0eebe410985dbeaa7ce84 type=CONTAINER_STARTED_EVENT May 27 03:57:56.014716 containerd[2800]: time="2025-05-27T03:57:56.014680271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"b6a7b1aa3b01c3da2a492bbb0f666ebf05674871e75830ef4cbfea6db0b7ba88\" pid:8972 exited_at:{seconds:1748318276 nanos:14500991}" May 27 03:57:59.352928 kubelet[4330]: E0527 03:57:59.352878 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:58:00.352356 kubelet[4330]: E0527 03:58:00.352314 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:58:12.961073 containerd[2800]: time="2025-05-27T03:58:12.961031993Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"82c443442ec14c880045f19ce566cd5903146063c861d818dce70a2eac202b52\" pid:9007 exited_at:{seconds:1748318292 nanos:960775632}" May 27 03:58:13.352267 kubelet[4330]: E0527 03:58:13.352210 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:58:13.352599 containerd[2800]: time="2025-05-27T03:58:13.352350477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:58:13.376200 containerd[2800]: time="2025-05-27T03:58:13.376086081Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:58:13.376377 containerd[2800]: time="2025-05-27T03:58:13.376349041Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:58:13.376422 containerd[2800]: time="2025-05-27T03:58:13.376399522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:58:13.376540 kubelet[4330]: E0527 03:58:13.376503 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:58:13.376593 kubelet[4330]: E0527 03:58:13.376547 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:58:13.376850 kubelet[4330]: E0527 03:58:13.376632 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b49c0486b30c4f02b12ce8a6e8271447,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:58:13.379221 containerd[2800]: time="2025-05-27T03:58:13.379203487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:58:13.403027 containerd[2800]: time="2025-05-27T03:58:13.402973971Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:58:13.403317 containerd[2800]: time="2025-05-27T03:58:13.403279491Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:58:13.403402 containerd[2800]: time="2025-05-27T03:58:13.403323811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:58:13.403550 kubelet[4330]: E0527 03:58:13.403497 
4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:58:13.403620 kubelet[4330]: E0527 03:58:13.403566 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:58:13.403765 kubelet[4330]: E0527 03:58:13.403717 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss4zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-79d4c87959-gb96j_calico-system(a3887ab7-1427-473a-8116-3289443cde48): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:58:13.404917 kubelet[4330]: E0527 
03:58:13.404884 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:58:17.303398 containerd[2800]: time="2025-05-27T03:58:17.303366643Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"13f561ce93bb14fd354c1290dc7ac5e28d4a2e0090e87b29a8805bd3af38100b\" pid:9046 exited_at:{seconds:1748318297 nanos:303176403}" May 27 03:58:24.351586 kubelet[4330]: E0527 03:58:24.351541 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:58:25.351717 containerd[2800]: time="2025-05-27T03:58:25.351692527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:58:25.373186 containerd[2800]: time="2025-05-27T03:58:25.373119166Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:58:25.373470 containerd[2800]: time="2025-05-27T03:58:25.373428766Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:58:25.373557 containerd[2800]: time="2025-05-27T03:58:25.373490526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:58:25.373737 kubelet[4330]: E0527 03:58:25.373646 4330 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:58:25.374028 kubelet[4330]: E0527 03:58:25.373748 4330 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:58:25.374028 kubelet[4330]: E0527 03:58:25.373886 4330 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-tt8gf_calico-system(5718ef86-6991-4e86-870c-b776228460c0): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:58:25.375089 kubelet[4330]: E0527 03:58:25.375058 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:58:26.019533 containerd[2800]: time="2025-05-27T03:58:26.019503896Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"4cd663d20fe3614faa7d09e2da655e6b83c67ead1ed6831b4816f534fc81709d\" pid:9068 exited_at:{seconds:1748318306 nanos:19267336}" May 27 03:58:36.351975 kubelet[4330]: E0527 03:58:36.351917 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" 
podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:58:40.351415 kubelet[4330]: E0527 03:58:40.351360 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:58:42.954721 containerd[2800]: time="2025-05-27T03:58:42.954684881Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"f4325237debbbb8d0d77c38c147c00ed39e6eeb0341cd69fcd562a0399a6483e\" pid:9095 exited_at:{seconds:1748318322 nanos:954377561}" May 27 03:58:48.352053 kubelet[4330]: E0527 03:58:48.352003 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:58:52.822885 update_engine[2788]: I20250527 03:58:52.822829 2788 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 27 03:58:52.822885 update_engine[2788]: I20250527 03:58:52.822885 2788 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 27 03:58:52.823200 update_engine[2788]: I20250527 03:58:52.823112 2788 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 27 03:58:52.823426 update_engine[2788]: I20250527 03:58:52.823410 2788 omaha_request_params.cc:62] Current group set to alpha May 27 03:58:52.823501 update_engine[2788]: I20250527 03:58:52.823489 2788 update_attempter.cc:499] Already updated boot flags. Skipping. May 27 03:58:52.823524 update_engine[2788]: I20250527 03:58:52.823498 2788 update_attempter.cc:643] Scheduling an action processor start. 
May 27 03:58:52.823524 update_engine[2788]: I20250527 03:58:52.823513 2788 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 03:58:52.823564 update_engine[2788]: I20250527 03:58:52.823535 2788 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 27 03:58:52.823593 update_engine[2788]: I20250527 03:58:52.823582 2788 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 03:58:52.823613 update_engine[2788]: I20250527 03:58:52.823589 2788 omaha_request_action.cc:272] Request: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: May 27 03:58:52.823613 update_engine[2788]: I20250527 03:58:52.823596 2788 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 03:58:52.823947 locksmithd[2829]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 27 03:58:52.824603 update_engine[2788]: I20250527 03:58:52.824585 2788 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 03:58:52.824908 update_engine[2788]: I20250527 03:58:52.824887 2788 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 03:58:52.825367 update_engine[2788]: E20250527 03:58:52.825350 2788 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 03:58:52.825408 update_engine[2788]: I20250527 03:58:52.825397 2788 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 27 03:58:54.351882 kubelet[4330]: E0527 03:58:54.351844 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:58:56.021664 containerd[2800]: time="2025-05-27T03:58:56.021619101Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"8dbc7c8f91a08806bc0ec56446006dd4a4593a5d02d920d1fe5686f0f940ff50\" pid:9154 exited_at:{seconds:1748318336 nanos:21432341}" May 27 03:58:59.352062 kubelet[4330]: E0527 03:58:59.352007 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:59:02.822755 update_engine[2788]: I20250527 03:59:02.822690 2788 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 03:59:02.823157 update_engine[2788]: I20250527 03:59:02.822981 2788 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 03:59:02.823222 update_engine[2788]: I20250527 03:59:02.823204 2788 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 03:59:02.918594 update_engine[2788]: E20250527 03:59:02.918555 2788 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 03:59:02.918662 update_engine[2788]: I20250527 03:59:02.918638 2788 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 27 03:59:06.352088 kubelet[4330]: E0527 03:59:06.351984 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:59:11.351447 kubelet[4330]: E0527 03:59:11.351392 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:59:12.822689 update_engine[2788]: I20250527 03:59:12.822598 2788 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 03:59:12.823119 update_engine[2788]: I20250527 03:59:12.822863 2788 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP 
May 27 03:59:12.823119 update_engine[2788]: I20250527 03:59:12.823081 2788 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 03:59:12.823526 update_engine[2788]: E20250527 03:59:12.823508 2788 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 03:59:12.823549 update_engine[2788]: I20250527 03:59:12.823540 2788 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 27 03:59:12.950955 containerd[2800]: time="2025-05-27T03:59:12.950904432Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"414a2ca67c0efae6a5187059d8db869f09cfceadf4b308591e384b93354b9d23\" pid:9188 exited_at:{seconds:1748318352 nanos:950627311}" May 27 03:59:17.294419 containerd[2800]: time="2025-05-27T03:59:17.294385176Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"1408021a5cc246a2d743221c1833579ac1308bcdfe17880110bf59c4b5868404\" pid:9225 exited_at:{seconds:1748318357 nanos:294237776}" May 27 03:59:17.351231 kubelet[4330]: E0527 03:59:17.351193 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:59:22.352462 kubelet[4330]: E0527 03:59:22.352415 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:59:22.822749 update_engine[2788]: I20250527 03:59:22.822692 2788 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 03:59:22.823147 update_engine[2788]: I20250527 03:59:22.822937 2788 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 03:59:22.823173 update_engine[2788]: I20250527 03:59:22.823155 2788 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 03:59:22.823694 update_engine[2788]: E20250527 03:59:22.823658 2788 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 03:59:22.823719 update_engine[2788]: I20250527 03:59:22.823707 2788 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 03:59:22.823739 update_engine[2788]: I20250527 03:59:22.823716 2788 omaha_request_action.cc:617] Omaha request response: May 27 03:59:22.823796 update_engine[2788]: E20250527 03:59:22.823784 2788 omaha_request_action.cc:636] Omaha request network transfer failed. May 27 03:59:22.823818 update_engine[2788]: I20250527 03:59:22.823801 2788 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 27 03:59:22.823818 update_engine[2788]: I20250527 03:59:22.823807 2788 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 03:59:22.823818 update_engine[2788]: I20250527 03:59:22.823812 2788 update_attempter.cc:306] Processing Done. May 27 03:59:22.823876 update_engine[2788]: E20250527 03:59:22.823823 2788 update_attempter.cc:619] Update failed. May 27 03:59:22.823876 update_engine[2788]: I20250527 03:59:22.823828 2788 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 27 03:59:22.823876 update_engine[2788]: I20250527 03:59:22.823832 2788 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 27 03:59:22.823876 update_engine[2788]: I20250527 03:59:22.823837 2788 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 27 03:59:22.823946 update_engine[2788]: I20250527 03:59:22.823891 2788 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 03:59:22.823946 update_engine[2788]: I20250527 03:59:22.823911 2788 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 03:59:22.823946 update_engine[2788]: I20250527 03:59:22.823915 2788 omaha_request_action.cc:272] Request: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: May 27 03:59:22.823946 update_engine[2788]: I20250527 03:59:22.823920 2788 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 03:59:22.824107 update_engine[2788]: I20250527 03:59:22.824026 2788 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 03:59:22.824186 locksmithd[2829]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 27 03:59:22.824346 update_engine[2788]: I20250527 03:59:22.824199 2788 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 03:59:22.824579 update_engine[2788]: E20250527 03:59:22.824563 2788 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 03:59:22.824601 update_engine[2788]: I20250527 03:59:22.824591 2788 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 03:59:22.824601 update_engine[2788]: I20250527 03:59:22.824597 2788 omaha_request_action.cc:617] Omaha request response: May 27 03:59:22.824638 update_engine[2788]: I20250527 03:59:22.824602 2788 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 03:59:22.824638 update_engine[2788]: I20250527 03:59:22.824607 2788 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 03:59:22.824638 update_engine[2788]: I20250527 03:59:22.824611 2788 update_attempter.cc:306] Processing Done. May 27 03:59:22.824638 update_engine[2788]: I20250527 03:59:22.824615 2788 update_attempter.cc:310] Error event sent. May 27 03:59:22.824638 update_engine[2788]: I20250527 03:59:22.824622 2788 update_check_scheduler.cc:74] Next update check in 44m51s May 27 03:59:22.824804 locksmithd[2829]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 27 03:59:26.019388 containerd[2800]: time="2025-05-27T03:59:26.019348218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"e41d3d6bb7c8d995fcb2f624a70fb6795e6e15d2835e8c2863eae4eebaf02d58\" pid:9247 exited_at:{seconds:1748318366 nanos:19163978}" May 27 03:59:29.351998 kubelet[4330]: E0527 03:59:29.351950 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:59:35.351974 kubelet[4330]: E0527 03:59:35.351913 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:59:40.351942 kubelet[4330]: E0527 03:59:40.351886 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:59:42.961632 containerd[2800]: time="2025-05-27T03:59:42.961590155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"4b4a0e9c09fe0963695420aa278513ed54c2533ac1413e027ec6e55c973e96e4\" pid:9269 exited_at:{seconds:1748318382 nanos:961299954}" May 27 03:59:50.352346 kubelet[4330]: E0527 03:59:50.352287 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 03:59:54.351756 kubelet[4330]: E0527 03:59:54.351692 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 03:59:56.022798 containerd[2800]: time="2025-05-27T03:59:56.022767080Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" 
id:\"68fe1b768d8318cdd25cd5e788bedf236dedc543c74eb7c0277ac00d087fc737\" pid:9305 exited_at:{seconds:1748318396 nanos:22618920}" May 27 04:00:01.352335 kubelet[4330]: E0527 04:00:01.352271 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:00:08.351951 kubelet[4330]: E0527 04:00:08.351878 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:00:12.962457 containerd[2800]: time="2025-05-27T04:00:12.962410679Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"5b229d39dd1892b6392769a75d9116a01d1739bb01ca30b6285fdc1b26ed2935\" pid:9329 exited_at:{seconds:1748318412 nanos:962160318}" May 27 04:00:16.352507 kubelet[4330]: E0527 04:00:16.352461 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:00:17.304575 containerd[2800]: time="2025-05-27T04:00:17.304539058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"ccee62b3c099591d3bdca7dff524743c1dd4945f4856bbe11743e0b00adf1806\" pid:9366 exited_at:{seconds:1748318417 nanos:304362098}" May 27 04:00:23.351321 kubelet[4330]: E0527 04:00:23.351271 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:00:26.021570 containerd[2800]: time="2025-05-27T04:00:26.021535647Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"7972c257ad347fe4ce3857e9a129608b5d0dfff48f4ec2271a64d215106b229c\" pid:9388 exited_at:{seconds:1748318426 nanos:21344207}" May 27 04:00:31.351379 kubelet[4330]: E0527 04:00:31.351325 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:00:35.352111 kubelet[4330]: E0527 04:00:35.352033 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" 
pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:00:42.957607 containerd[2800]: time="2025-05-27T04:00:42.957554537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"a77cc467e40d9b0a95d5eb20e9823441746fcb6d86ebe5486b3ceabbbd4a6893\" pid:9436 exited_at:{seconds:1748318442 nanos:957316096}" May 27 04:00:43.330599 systemd[1]: Started sshd@10-147.28.163.138:22-139.178.89.65:56842.service - OpenSSH per-connection server daemon (139.178.89.65:56842). May 27 04:00:43.746195 sshd[9463]: Accepted publickey for core from 139.178.89.65 port 56842 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:00:43.747371 sshd-session[9463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:00:43.750837 systemd-logind[2785]: New session 10 of user core. May 27 04:00:43.763774 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 04:00:44.106283 sshd[9465]: Connection closed by 139.178.89.65 port 56842 May 27 04:00:44.106713 sshd-session[9463]: pam_unix(sshd:session): session closed for user core May 27 04:00:44.109766 systemd[1]: sshd@10-147.28.163.138:22-139.178.89.65:56842.service: Deactivated successfully. May 27 04:00:44.112087 systemd[1]: session-10.scope: Deactivated successfully. May 27 04:00:44.112691 systemd-logind[2785]: Session 10 logged out. Waiting for processes to exit. May 27 04:00:44.113564 systemd-logind[2785]: Removed session 10. May 27 04:00:46.352373 kubelet[4330]: E0527 04:00:46.352302 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:00:48.351932 kubelet[4330]: E0527 04:00:48.351882 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" 
podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:00:49.187573 systemd[1]: Started sshd@11-147.28.163.138:22-139.178.89.65:56856.service - OpenSSH per-connection server daemon (139.178.89.65:56856). May 27 04:00:49.592402 sshd[9510]: Accepted publickey for core from 139.178.89.65 port 56856 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:00:49.593722 sshd-session[9510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:00:49.597035 systemd-logind[2785]: New session 11 of user core. May 27 04:00:49.614830 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 04:00:49.940097 sshd[9512]: Connection closed by 139.178.89.65 port 56856 May 27 04:00:49.940401 sshd-session[9510]: pam_unix(sshd:session): session closed for user core May 27 04:00:49.943529 systemd[1]: sshd@11-147.28.163.138:22-139.178.89.65:56856.service: Deactivated successfully. May 27 04:00:49.945851 systemd[1]: session-11.scope: Deactivated successfully. May 27 04:00:49.946438 systemd-logind[2785]: Session 11 logged out. Waiting for processes to exit. May 27 04:00:49.947318 systemd-logind[2785]: Removed session 11. May 27 04:00:50.026526 systemd[1]: Started sshd@12-147.28.163.138:22-139.178.89.65:56860.service - OpenSSH per-connection server daemon (139.178.89.65:56860). May 27 04:00:50.428427 sshd[9550]: Accepted publickey for core from 139.178.89.65 port 56860 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:00:50.429707 sshd-session[9550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:00:50.433033 systemd-logind[2785]: New session 12 of user core. May 27 04:00:50.454793 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 04:00:50.798985 sshd[9552]: Connection closed by 139.178.89.65 port 56860 May 27 04:00:50.799342 sshd-session[9550]: pam_unix(sshd:session): session closed for user core May 27 04:00:50.802425 systemd[1]: sshd@12-147.28.163.138:22-139.178.89.65:56860.service: Deactivated successfully. May 27 04:00:50.804140 systemd[1]: session-12.scope: Deactivated successfully. May 27 04:00:50.804717 systemd-logind[2785]: Session 12 logged out. Waiting for processes to exit. May 27 04:00:50.805576 systemd-logind[2785]: Removed session 12. May 27 04:00:50.890737 systemd[1]: Started sshd@13-147.28.163.138:22-139.178.89.65:56866.service - OpenSSH per-connection server daemon (139.178.89.65:56866). May 27 04:00:51.292742 sshd[9593]: Accepted publickey for core from 139.178.89.65 port 56866 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:00:51.294173 sshd-session[9593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:00:51.297552 systemd-logind[2785]: New session 13 of user core. May 27 04:00:51.313808 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 04:00:51.638895 sshd[9595]: Connection closed by 139.178.89.65 port 56866 May 27 04:00:51.639227 sshd-session[9593]: pam_unix(sshd:session): session closed for user core May 27 04:00:51.642175 systemd[1]: sshd@13-147.28.163.138:22-139.178.89.65:56866.service: Deactivated successfully. May 27 04:00:51.643902 systemd[1]: session-13.scope: Deactivated successfully. May 27 04:00:51.644464 systemd-logind[2785]: Session 13 logged out. Waiting for processes to exit. May 27 04:00:51.645311 systemd-logind[2785]: Removed session 13. 
May 27 04:00:56.022514 containerd[2800]: time="2025-05-27T04:00:56.022469842Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"fa98982cbf87940c948ef4c5ee04e5c1a0ba6ee3460237cc17f9fa64d12c9b67\" pid:9645 exited_at:{seconds:1748318456 nanos:22277562}" May 27 04:00:56.717640 systemd[1]: Started sshd@14-147.28.163.138:22-139.178.89.65:53474.service - OpenSSH per-connection server daemon (139.178.89.65:53474). May 27 04:00:57.128092 sshd[9656]: Accepted publickey for core from 139.178.89.65 port 53474 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:00:57.129244 sshd-session[9656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:00:57.132277 systemd-logind[2785]: New session 14 of user core. May 27 04:00:57.155841 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 04:00:57.480073 sshd[9658]: Connection closed by 139.178.89.65 port 53474 May 27 04:00:57.480364 sshd-session[9656]: pam_unix(sshd:session): session closed for user core May 27 04:00:57.483286 systemd[1]: sshd@14-147.28.163.138:22-139.178.89.65:53474.service: Deactivated successfully. May 27 04:00:57.485036 systemd[1]: session-14.scope: Deactivated successfully. May 27 04:00:57.485605 systemd-logind[2785]: Session 14 logged out. Waiting for processes to exit. May 27 04:00:57.486473 systemd-logind[2785]: Removed session 14. May 27 04:00:59.352134 kubelet[4330]: E0527 04:00:59.352084 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:01:00.351950 kubelet[4330]: E0527 04:01:00.351913 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:01:02.554717 systemd[1]: Started 
sshd@15-147.28.163.138:22-139.178.89.65:53486.service - OpenSSH per-connection server daemon (139.178.89.65:53486). May 27 04:01:02.953658 sshd[9699]: Accepted publickey for core from 139.178.89.65 port 53486 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:02.954648 sshd-session[9699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:02.957846 systemd-logind[2785]: New session 15 of user core. May 27 04:01:02.974773 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 04:01:03.298181 sshd[9702]: Connection closed by 139.178.89.65 port 53486 May 27 04:01:03.298550 sshd-session[9699]: pam_unix(sshd:session): session closed for user core May 27 04:01:03.301520 systemd[1]: sshd@15-147.28.163.138:22-139.178.89.65:53486.service: Deactivated successfully. May 27 04:01:03.303725 systemd[1]: session-15.scope: Deactivated successfully. May 27 04:01:03.304278 systemd-logind[2785]: Session 15 logged out. Waiting for processes to exit. May 27 04:01:03.305096 systemd-logind[2785]: Removed session 15. May 27 04:01:08.375444 systemd[1]: Started sshd@16-147.28.163.138:22-139.178.89.65:50894.service - OpenSSH per-connection server daemon (139.178.89.65:50894). May 27 04:01:08.777407 sshd[9743]: Accepted publickey for core from 139.178.89.65 port 50894 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:08.778621 sshd-session[9743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:08.781805 systemd-logind[2785]: New session 16 of user core. May 27 04:01:08.804768 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 04:01:09.126479 sshd[9745]: Connection closed by 139.178.89.65 port 50894 May 27 04:01:09.126790 sshd-session[9743]: pam_unix(sshd:session): session closed for user core May 27 04:01:09.129761 systemd[1]: sshd@16-147.28.163.138:22-139.178.89.65:50894.service: Deactivated successfully. May 27 04:01:09.131367 systemd[1]: session-16.scope: Deactivated successfully. May 27 04:01:09.131912 systemd-logind[2785]: Session 16 logged out. Waiting for processes to exit. May 27 04:01:09.132727 systemd-logind[2785]: Removed session 16. May 27 04:01:09.197417 systemd[1]: Started sshd@17-147.28.163.138:22-139.178.89.65:50910.service - OpenSSH per-connection server daemon (139.178.89.65:50910). May 27 04:01:09.618037 sshd[9778]: Accepted publickey for core from 139.178.89.65 port 50910 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:09.619188 sshd-session[9778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:09.622119 systemd-logind[2785]: New session 17 of user core. May 27 04:01:09.631817 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 04:01:09.984889 sshd[9780]: Connection closed by 139.178.89.65 port 50910 May 27 04:01:09.985146 sshd-session[9778]: pam_unix(sshd:session): session closed for user core May 27 04:01:09.988183 systemd[1]: sshd@17-147.28.163.138:22-139.178.89.65:50910.service: Deactivated successfully. May 27 04:01:09.990423 systemd[1]: session-17.scope: Deactivated successfully. May 27 04:01:09.990997 systemd-logind[2785]: Session 17 logged out. Waiting for processes to exit. May 27 04:01:09.991852 systemd-logind[2785]: Removed session 17. May 27 04:01:10.062490 systemd[1]: Started sshd@18-147.28.163.138:22-139.178.89.65:50912.service - OpenSSH per-connection server daemon (139.178.89.65:50912). 
May 27 04:01:10.462808 sshd[9812]: Accepted publickey for core from 139.178.89.65 port 50912 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:10.463995 sshd-session[9812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:10.466991 systemd-logind[2785]: New session 18 of user core. May 27 04:01:10.483835 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 04:01:11.222517 sshd[9814]: Connection closed by 139.178.89.65 port 50912 May 27 04:01:11.222903 sshd-session[9812]: pam_unix(sshd:session): session closed for user core May 27 04:01:11.225969 systemd[1]: sshd@18-147.28.163.138:22-139.178.89.65:50912.service: Deactivated successfully. May 27 04:01:11.228298 systemd[1]: session-18.scope: Deactivated successfully. May 27 04:01:11.229499 systemd-logind[2785]: Session 18 logged out. Waiting for processes to exit. May 27 04:01:11.230496 systemd-logind[2785]: Removed session 18. May 27 04:01:11.303520 systemd[1]: Started sshd@19-147.28.163.138:22-139.178.89.65:50920.service - OpenSSH per-connection server daemon (139.178.89.65:50920). May 27 04:01:11.716561 sshd[9871]: Accepted publickey for core from 139.178.89.65 port 50920 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:11.717836 sshd-session[9871]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:11.720994 systemd-logind[2785]: New session 19 of user core. May 27 04:01:11.743806 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 04:01:12.161871 sshd[9873]: Connection closed by 139.178.89.65 port 50920 May 27 04:01:12.162193 sshd-session[9871]: pam_unix(sshd:session): session closed for user core May 27 04:01:12.165516 systemd[1]: sshd@19-147.28.163.138:22-139.178.89.65:50920.service: Deactivated successfully. May 27 04:01:12.167794 systemd[1]: session-19.scope: Deactivated successfully. May 27 04:01:12.168413 systemd-logind[2785]: Session 19 logged out. Waiting for processes to exit. May 27 04:01:12.169240 systemd-logind[2785]: Removed session 19. May 27 04:01:12.238452 systemd[1]: Started sshd@20-147.28.163.138:22-139.178.89.65:50928.service - OpenSSH per-connection server daemon (139.178.89.65:50928). May 27 04:01:12.650937 sshd[9922]: Accepted publickey for core from 139.178.89.65 port 50928 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:12.652169 sshd-session[9922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:12.655324 systemd-logind[2785]: New session 20 of user core. May 27 04:01:12.671825 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 04:01:12.965520 containerd[2800]: time="2025-05-27T04:01:12.965417527Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1b566b40be0d971411893bc609dddc48e3c7e9a4ffa8f3326c266043061890a1\" id:\"a14284b80ff000c648692c9248c135db0faf2ba0f210da8dd400365f541a7350\" pid:9952 exited_at:{seconds:1748318472 nanos:965157527}" May 27 04:01:13.003754 sshd[9924]: Connection closed by 139.178.89.65 port 50928 May 27 04:01:13.004144 sshd-session[9922]: pam_unix(sshd:session): session closed for user core May 27 04:01:13.007204 systemd[1]: sshd@20-147.28.163.138:22-139.178.89.65:50928.service: Deactivated successfully. May 27 04:01:13.008848 systemd[1]: session-20.scope: Deactivated successfully. May 27 04:01:13.009408 systemd-logind[2785]: Session 20 logged out. Waiting for processes to exit. 
May 27 04:01:13.010264 systemd-logind[2785]: Removed session 20. May 27 04:01:13.351875 kubelet[4330]: E0527 04:01:13.351825 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:01:13.352204 kubelet[4330]: E0527 04:01:13.352015 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:01:17.312519 containerd[2800]: time="2025-05-27T04:01:17.312481341Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"cdd369da2adec94f2fbff269a759c6afa1811c3062c2cf59bc653382199d0cde\" pid:10012 exited_at:{seconds:1748318477 nanos:312266621}" May 27 04:01:18.078642 systemd[1]: Started sshd@21-147.28.163.138:22-139.178.89.65:50024.service - OpenSSH per-connection server daemon (139.178.89.65:50024). May 27 04:01:18.478847 sshd[10023]: Accepted publickey for core from 139.178.89.65 port 50024 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:18.480111 sshd-session[10023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:18.483503 systemd-logind[2785]: New session 21 of user core. May 27 04:01:18.493800 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 04:01:18.826776 sshd[10025]: Connection closed by 139.178.89.65 port 50024 May 27 04:01:18.827174 sshd-session[10023]: pam_unix(sshd:session): session closed for user core May 27 04:01:18.830289 systemd[1]: sshd@21-147.28.163.138:22-139.178.89.65:50024.service: Deactivated successfully. May 27 04:01:18.831951 systemd[1]: session-21.scope: Deactivated successfully. May 27 04:01:18.832504 systemd-logind[2785]: Session 21 logged out. Waiting for processes to exit. May 27 04:01:18.833314 systemd-logind[2785]: Removed session 21. 
May 27 04:01:23.914559 systemd[1]: Started sshd@22-147.28.163.138:22-139.178.89.65:53678.service - OpenSSH per-connection server daemon (139.178.89.65:53678). May 27 04:01:24.322117 sshd[10059]: Accepted publickey for core from 139.178.89.65 port 53678 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:24.323289 sshd-session[10059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:24.326307 systemd-logind[2785]: New session 22 of user core. May 27 04:01:24.336835 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 04:01:24.351804 kubelet[4330]: E0527 04:01:24.351763 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-79d4c87959-gb96j" podUID="a3887ab7-1427-473a-8116-3289443cde48" May 27 04:01:24.672879 sshd[10061]: Connection closed by 139.178.89.65 port 53678 May 27 04:01:24.673206 sshd-session[10059]: pam_unix(sshd:session): session closed for user core May 27 04:01:24.676285 systemd[1]: sshd@22-147.28.163.138:22-139.178.89.65:53678.service: Deactivated successfully. May 27 04:01:24.677961 systemd[1]: session-22.scope: Deactivated successfully. May 27 04:01:24.678522 systemd-logind[2785]: Session 22 logged out. Waiting for processes to exit. May 27 04:01:24.679360 systemd-logind[2785]: Removed session 22. 
May 27 04:01:26.021535 containerd[2800]: time="2025-05-27T04:01:26.021501578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3b3d17e9c4d7da0d46fc3ea0492776a20e01611af9237d4e06d6de3ad2786aa3\" id:\"9f73b476460175c331d54766d6ab6ac68a0fba9330855e900ba895446d2143c2\" pid:10103 exited_at:{seconds:1748318486 nanos:21297097}" May 27 04:01:26.351836 kubelet[4330]: E0527 04:01:26.351777 4330 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-tt8gf" podUID="5718ef86-6991-4e86-870c-b776228460c0" May 27 04:01:29.759613 systemd[1]: Started sshd@23-147.28.163.138:22-139.178.89.65:53694.service - OpenSSH per-connection server daemon (139.178.89.65:53694). May 27 04:01:30.163002 sshd[10115]: Accepted publickey for core from 139.178.89.65 port 53694 ssh2: RSA SHA256:HJh6UTMgN85bG8HCxILOWNqxpjzDPDuMvm+Ey/HdZfA May 27 04:01:30.164190 sshd-session[10115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 04:01:30.167534 systemd-logind[2785]: New session 23 of user core. May 27 04:01:30.187839 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 04:01:30.509277 sshd[10117]: Connection closed by 139.178.89.65 port 53694 May 27 04:01:30.509616 sshd-session[10115]: pam_unix(sshd:session): session closed for user core May 27 04:01:30.512625 systemd[1]: sshd@23-147.28.163.138:22-139.178.89.65:53694.service: Deactivated successfully. May 27 04:01:30.515001 systemd[1]: session-23.scope: Deactivated successfully. May 27 04:01:30.515694 systemd-logind[2785]: Session 23 logged out. Waiting for processes to exit. May 27 04:01:30.516629 systemd-logind[2785]: Removed session 23.
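Throughout this section kubelet keeps reporting ImagePullBackOff for the calico goldmane, whisker and whisker-backend images because the anonymous token request to ghcr.io returns 403 Forbidden; the registry is refusing the pull outright, so the back-off alone will never clear it. A minimal sketch that reproduces the same anonymous token fetch outside containerd, using only the URL already quoted in the kubelet errors (the use of urllib here is an illustrative choice, not something the node itself runs):

    # Sketch only: hit the same ghcr.io token endpoint that the kubelet errors
    # above quote, to confirm whether anonymous pulls are still being denied.
    import urllib.error
    import urllib.request

    URL = ("https://ghcr.io/token"
           "?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io")

    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            print("token endpoint status:", resp.status)  # 200 would mean anonymous pulls are allowed again
    except urllib.error.HTTPError as err:
        print("token endpoint status:", err.code)         # 403 matches the ImagePullBackOff entries above

A 403 from this endpoint is an authorization decision made by the registry, so the remedy lies on the image side (publish the images, grant anonymous pull, or point the node at a mirror or pre-loaded copies) rather than in kubelet's retry behaviour.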