May 27 18:07:19.185704 kernel: Booting Linux on physical CPU 0x0000120000 [0x413fd0c1]
May 27 18:07:19.185726 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 27 15:31:23 -00 2025
May 27 18:07:19.185734 kernel: KASLR enabled
May 27 18:07:19.185740 kernel: efi: EFI v2.7 by American Megatrends
May 27 18:07:19.185745 kernel: efi: ACPI 2.0=0xec080000 SMBIOS 3.0=0xf0a1ff98 ESRT=0xea47e818 RNG=0xebf10018 MEMRESERVE=0xe465af98
May 27 18:07:19.185751 kernel: random: crng init done
May 27 18:07:19.185757 kernel: secureboot: Secure boot disabled
May 27 18:07:19.185763 kernel: esrt: Reserving ESRT space from 0x00000000ea47e818 to 0x00000000ea47e878.
May 27 18:07:19.185770 kernel: ACPI: Early table checksum verification disabled
May 27 18:07:19.185776 kernel: ACPI: RSDP 0x00000000EC080000 000024 (v02 Ampere)
May 27 18:07:19.185782 kernel: ACPI: XSDT 0x00000000EC070000 0000A4 (v01 Ampere Altra 00000000 AMI 01000013)
May 27 18:07:19.185788 kernel: ACPI: FACP 0x00000000EC050000 000114 (v06 Ampere Altra 00000000 INTL 20190509)
May 27 18:07:19.185794 kernel: ACPI: DSDT 0x00000000EBFF0000 019B57 (v02 Ampere Jade 00000001 INTL 20200717)
May 27 18:07:19.185800 kernel: ACPI: DBG2 0x00000000EC060000 00005C (v00 Ampere Altra 00000000 INTL 20190509)
May 27 18:07:19.185808 kernel: ACPI: GTDT 0x00000000EC040000 000110 (v03 Ampere Altra 00000000 INTL 20190509)
May 27 18:07:19.185814 kernel: ACPI: SSDT 0x00000000EC030000 00002D (v02 Ampere Altra 00000001 INTL 20190509)
May 27 18:07:19.185820 kernel: ACPI: FIDT 0x00000000EBFE0000 00009C (v01 ALASKA A M I 01072009 AMI 00010013)
May 27 18:07:19.185826 kernel: ACPI: SPCR 0x00000000EBFD0000 000050 (v02 ALASKA A M I 01072009 AMI 0005000F)
May 27 18:07:19.185832 kernel: ACPI: BGRT 0x00000000EBFC0000 000038 (v01 ALASKA A M I 01072009 AMI 00010013)
May 27 18:07:19.185838 kernel: ACPI: MCFG 0x00000000EBFB0000 0000AC (v01 Ampere Altra 00000001 AMP. 01000013)
May 27 18:07:19.185844 kernel: ACPI: IORT 0x00000000EBFA0000 000610 (v00 Ampere Altra 00000000 AMP. 01000013)
May 27 18:07:19.185850 kernel: ACPI: PPTT 0x00000000EBF80000 006E60 (v02 Ampere Altra 00000000 AMP. 01000013)
May 27 18:07:19.185856 kernel: ACPI: SLIT 0x00000000EBF70000 00002D (v01 Ampere Altra 00000000 AMP. 01000013)
May 27 18:07:19.185863 kernel: ACPI: SRAT 0x00000000EBF60000 0006D0 (v03 Ampere Altra 00000000 AMP. 01000013)
May 27 18:07:19.185870 kernel: ACPI: APIC 0x00000000EBF90000 0019F4 (v05 Ampere Altra 00000003 AMI 01000013)
May 27 18:07:19.185876 kernel: ACPI: PCCT 0x00000000EBF40000 000576 (v02 Ampere Altra 00000003 AMP. 01000013)
May 27 18:07:19.185882 kernel: ACPI: WSMT 0x00000000EBF30000 000028 (v01 ALASKA A M I 01072009 AMI 00010013)
May 27 18:07:19.185888 kernel: ACPI: FPDT 0x00000000EBF20000 000044 (v01 ALASKA A M I 01072009 AMI 01000013)
May 27 18:07:19.185894 kernel: ACPI: SPCR: console: pl011,mmio32,0x100002600000,115200
May 27 18:07:19.185900 kernel: ACPI: Use ACPI SPCR as default console: Yes
May 27 18:07:19.185906 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x88300000-0x883fffff]
May 27 18:07:19.185912 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x90000000-0xffffffff]
May 27 18:07:19.185918 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0x8007fffffff]
May 27 18:07:19.185924 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80100000000-0x83fffffffff]
May 27 18:07:19.185930 kernel: NUMA: Initialized distance table, cnt=1
May 27 18:07:19.185938 kernel: NUMA: Node 0 [mem 0x88300000-0x883fffff] + [mem 0x90000000-0xffffffff] -> [mem 0x88300000-0xffffffff]
May 27 18:07:19.185944 kernel: NUMA: Node 0 [mem 0x88300000-0xffffffff] + [mem 0x80000000000-0x8007fffffff] -> [mem 0x88300000-0x8007fffffff]
May 27 18:07:19.185951 kernel: NUMA: Node 0 [mem 0x88300000-0x8007fffffff] + [mem 0x80100000000-0x83fffffffff] -> [mem 0x88300000-0x83fffffffff]
May 27 18:07:19.185957 kernel: NODE_DATA(0) allocated [mem 0x83fdffd8dc0-0x83fdffdffff]
May 27 18:07:19.185963 kernel: Zone ranges:
May 27 18:07:19.185972 kernel:   DMA      [mem 0x0000000088300000-0x00000000ffffffff]
May 27 18:07:19.185980 kernel:   DMA32    empty
May 27 18:07:19.185986 kernel:   Normal   [mem 0x0000000100000000-0x0000083fffffffff]
May 27 18:07:19.185992 kernel:   Device   empty
May 27 18:07:19.185999 kernel: Movable zone start for each node
May 27 18:07:19.186005 kernel: Early memory node ranges
May 27 18:07:19.186011 kernel:   node   0: [mem 0x0000000088300000-0x00000000883fffff]
May 27 18:07:19.186018 kernel:   node   0: [mem 0x0000000090000000-0x0000000091ffffff]
May 27 18:07:19.186024 kernel:   node   0: [mem 0x0000000092000000-0x0000000093ffffff]
May 27 18:07:19.186031 kernel:   node   0: [mem 0x0000000094000000-0x00000000eba37fff]
May 27 18:07:19.186037 kernel:   node   0: [mem 0x00000000eba38000-0x00000000ebeccfff]
May 27 18:07:19.186044 kernel:   node   0: [mem 0x00000000ebecd000-0x00000000ebecdfff]
May 27 18:07:19.186051 kernel:   node   0: [mem 0x00000000ebece000-0x00000000ebecffff]
May 27 18:07:19.186058 kernel:   node   0: [mem 0x00000000ebed0000-0x00000000ec0effff]
May 27 18:07:19.186064 kernel:   node   0: [mem 0x00000000ec0f0000-0x00000000ec0fffff]
May 27 18:07:19.186070 kernel:   node   0: [mem 0x00000000ec100000-0x00000000ee53ffff]
May 27 18:07:19.186077 kernel:   node   0: [mem 0x00000000ee540000-0x00000000f765ffff]
May 27 18:07:19.186083 kernel:   node   0: [mem 0x00000000f7660000-0x00000000f784ffff]
May 27 18:07:19.186089 kernel:   node   0: [mem 0x00000000f7850000-0x00000000f7fdffff]
May 27 18:07:19.186096 kernel:   node   0: [mem 0x00000000f7fe0000-0x00000000ffc8efff]
May 27 18:07:19.186102 kernel:   node   0: [mem 0x00000000ffc8f000-0x00000000ffc8ffff]
May 27 18:07:19.186108 kernel:   node   0: [mem 0x00000000ffc90000-0x00000000ffffffff]
May 27 18:07:19.186115 kernel:   node   0: [mem 0x0000080000000000-0x000008007fffffff]
May 27 18:07:19.186122 kernel:   node   0: [mem 0x0000080100000000-0x0000083fffffffff]
May 27 18:07:19.186129 kernel: Initmem setup node 0 [mem 0x0000000088300000-0x0000083fffffffff]
May 27 18:07:19.186135 kernel: On node 0, zone DMA: 768 pages in unavailable ranges
May 27 18:07:19.186141 kernel: On node 0, zone DMA: 31744 pages in unavailable ranges
May 27 18:07:19.186148 kernel: psci: probing for conduit method from ACPI.
May 27 18:07:19.186154 kernel: psci: PSCIv1.1 detected in firmware.
May 27 18:07:19.186161 kernel: psci: Using standard PSCI v0.2 function IDs
May 27 18:07:19.186167 kernel: psci: MIGRATE_INFO_TYPE not supported.
May 27 18:07:19.186173 kernel: psci: SMC Calling Convention v1.2
May 27 18:07:19.186180 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
May 27 18:07:19.186186 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100 -> Node 0
May 27 18:07:19.186192 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x10000 -> Node 0
May 27 18:07:19.186200 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x10100 -> Node 0
May 27 18:07:19.186206 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x20000 -> Node 0
May 27 18:07:19.186213 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x20100 -> Node 0
May 27 18:07:19.186219 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x30000 -> Node 0
May 27 18:07:19.186226 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x30100 -> Node 0
May 27 18:07:19.186232 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x40000 -> Node 0
May 27 18:07:19.186238 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x40100 -> Node 0
May 27 18:07:19.186245 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x50000 -> Node 0
May 27 18:07:19.186251 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x50100 -> Node 0
May 27 18:07:19.186257 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x60000 -> Node 0
May 27 18:07:19.186264 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x60100 -> Node 0
May 27 18:07:19.186271 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x70000 -> Node 0
May 27 18:07:19.186278 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x70100 -> Node 0
May 27 18:07:19.186284 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x80000 -> Node 0
May 27 18:07:19.186290 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x80100 -> Node 0
May 27 18:07:19.186297 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x90000 -> Node 0
May 27 18:07:19.186303 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x90100 -> Node 0
May 27 18:07:19.186309 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xa0000 -> Node 0
May 27 18:07:19.186316 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xa0100 -> Node 0
May 27 18:07:19.186322 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xb0000 -> Node 0
May 27 18:07:19.186328 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xb0100 -> Node 0
May 27 18:07:19.186335 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xc0000 -> Node 0
May 27 18:07:19.186341 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xc0100 -> Node 0
May 27 18:07:19.186349 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xd0000 -> Node 0
May 27 18:07:19.186355 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xd0100 -> Node 0
May 27 18:07:19.186362 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xe0000 -> Node 0
May 27 18:07:19.186368 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xe0100 -> Node 0
May 27 18:07:19.186375 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xf0000 -> Node 0
May 27 18:07:19.186381 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0xf0100 -> Node 0
May 27 18:07:19.186387 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100000 -> Node 0
May 27 18:07:19.186394 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x100100 -> Node 0
May 27 18:07:19.186400 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x110000 -> Node 0
May 27 18:07:19.186407 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x110100 -> Node 0
May 27 18:07:19.186413 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x120000 -> Node 0
May 27 18:07:19.186421 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x120100 -> Node 0
May 27 18:07:19.186427 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x130000 -> Node 0
May 27 18:07:19.186433 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x130100 -> Node 0
May 27 18:07:19.186440 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x140000 -> Node 0
May 27 18:07:19.186446 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x140100 -> Node 0
May 27 18:07:19.186453 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x150000 -> Node 0
May 27 18:07:19.186459 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x150100 -> Node 0
May 27 18:07:19.186465 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x160000 -> Node 0
May 27 18:07:19.186472 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x160100 -> Node 0
May 27 18:07:19.186484 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x170000 -> Node 0
May 27 18:07:19.186492 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x170100 -> Node 0
May 27 18:07:19.186499 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x180000 -> Node 0
May 27 18:07:19.186506 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x180100 -> Node 0
May 27 18:07:19.186513 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x190000 -> Node 0
May 27 18:07:19.186519 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x190100 -> Node 0
May 27 18:07:19.186526 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1a0000 -> Node 0
May 27 18:07:19.186534 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1a0100 -> Node 0
May 27 18:07:19.186541 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1b0000 -> Node 0
May 27 18:07:19.186547 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1b0100 -> Node 0
May 27 18:07:19.186554 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1c0000 -> Node 0
May 27 18:07:19.186561 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1c0100 -> Node 0
May 27 18:07:19.186568 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1d0000 -> Node 0
May 27 18:07:19.186574 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1d0100 -> Node 0
May 27 18:07:19.186585 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1e0000 -> Node 0
May 27 18:07:19.186593 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1e0100 -> Node 0
May 27 18:07:19.186599 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1f0000 -> Node 0
May 27 18:07:19.186606 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1f0100 -> Node 0
May 27 18:07:19.186613 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x200000 -> Node 0
May 27 18:07:19.186621 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x200100 -> Node 0
May 27 18:07:19.186628 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x210000 -> Node 0
May 27 18:07:19.186635 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x210100 -> Node 0
May 27 18:07:19.186642 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x220000 -> Node 0
May 27 18:07:19.186648 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x220100 -> Node 0
May 27 18:07:19.186655 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x230000 -> Node 0
May 27 18:07:19.186662 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x230100 -> Node 0
May 27 18:07:19.186669 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x240000 -> Node 0
May 27 18:07:19.186675 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x240100 -> Node 0
May 27 18:07:19.186682 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x250000 -> Node 0
May 27 18:07:19.186689 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x250100 -> Node 0
May 27 18:07:19.186696 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x260000 -> Node 0
May 27 18:07:19.186704 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x260100 -> Node 0
May 27 18:07:19.186710 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x270000 -> Node 0
May 27 18:07:19.186717 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x270100 -> Node 0
May 27 18:07:19.186724 kernel: percpu: Embedded 33 pages/cpu s98136 r8192 d28840 u135168
May 27 18:07:19.186731 kernel: pcpu-alloc: s98136 r8192 d28840 u135168 alloc=33*4096
May 27 18:07:19.186738 kernel: pcpu-alloc: [0] 00 [0] 01 [0] 02 [0] 03 [0] 04 [0] 05 [0] 06 [0] 07
May 27 18:07:19.186744 kernel: pcpu-alloc: [0] 08 [0] 09 [0] 10 [0] 11 [0] 12 [0] 13 [0] 14 [0] 15
May 27 18:07:19.186751 kernel: pcpu-alloc: [0] 16 [0] 17 [0] 18 [0] 19 [0] 20 [0] 21 [0] 22 [0] 23
May 27 18:07:19.186758 kernel: pcpu-alloc: [0] 24 [0] 25 [0] 26 [0] 27 [0] 28 [0] 29 [0] 30 [0] 31
May 27 18:07:19.186765 kernel: pcpu-alloc: [0] 32 [0] 33 [0] 34 [0] 35 [0] 36 [0] 37 [0] 38 [0] 39
May 27 18:07:19.186771 kernel: pcpu-alloc: [0] 40 [0] 41 [0] 42 [0] 43 [0] 44 [0] 45 [0] 46 [0] 47
May 27 18:07:19.186779 kernel: pcpu-alloc: [0] 48 [0] 49 [0] 50 [0] 51 [0] 52 [0] 53 [0] 54 [0] 55
May 27 18:07:19.186786 kernel: pcpu-alloc: [0] 56 [0] 57 [0] 58 [0] 59 [0] 60 [0] 61 [0] 62 [0] 63
May 27 18:07:19.186793 kernel: pcpu-alloc: [0] 64 [0] 65 [0] 66 [0] 67 [0] 68 [0] 69 [0] 70 [0] 71
May 27 18:07:19.186800 kernel: pcpu-alloc: [0] 72 [0] 73 [0] 74 [0] 75 [0] 76 [0] 77 [0] 78 [0] 79
May 27 18:07:19.186806 kernel: Detected PIPT I-cache on CPU0
May 27 18:07:19.186813 kernel: CPU features: detected: GIC system register CPU interface
May 27 18:07:19.186820 kernel: CPU features: detected: Virtualization Host Extensions
May 27 18:07:19.186827 kernel: CPU features: detected: Spectre-v4
May 27 18:07:19.186833 kernel: CPU features: detected: Spectre-BHB
May 27 18:07:19.186840 kernel: CPU features: kernel page table isolation forced ON by KASLR
May 27 18:07:19.186847 kernel: CPU features: detected: Kernel page table isolation (KPTI)
May 27 18:07:19.186855 kernel: CPU features: detected: ARM erratum 1418040
May 27 18:07:19.186862 kernel: CPU features: detected: SSBS not fully self-synchronizing
May 27 18:07:19.186869 kernel: alternatives: applying boot alternatives
May 27 18:07:19.186877 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=packet flatcar.autologin verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3
May 27 18:07:19.186884 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 18:07:19.186891 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
May 27 18:07:19.186897 kernel: printk: log_buf_len total cpu_extra contributions: 323584 bytes
May 27 18:07:19.186904 kernel: printk: log_buf_len min size: 262144 bytes
May 27 18:07:19.186911 kernel: printk: log_buf_len: 1048576 bytes
May 27 18:07:19.186918 kernel: printk: early log buf free: 249568(95%)
May 27 18:07:19.186925 kernel: Dentry cache hash table entries: 16777216 (order: 15, 134217728 bytes, linear)
May 27 18:07:19.186933 kernel: Inode-cache hash table entries: 8388608 (order: 14, 67108864 bytes, linear)
May 27 18:07:19.186940 kernel: Fallback order for Node 0: 0
May 27 18:07:19.186947 kernel: Built 1 zonelists, mobility grouping on. Total pages: 67043584
May 27 18:07:19.186954 kernel: Policy zone: Normal
May 27 18:07:19.186960 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 18:07:19.186967 kernel: software IO TLB: area num 128.
May 27 18:07:19.186974 kernel: software IO TLB: mapped [mem 0x00000000fbc8f000-0x00000000ffc8f000] (64MB)
May 27 18:07:19.186981 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=80, Nodes=1
May 27 18:07:19.186988 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 18:07:19.186995 kernel: rcu: RCU event tracing is enabled.
May 27 18:07:19.187002 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=80.
May 27 18:07:19.187010 kernel: Trampoline variant of Tasks RCU enabled.
May 27 18:07:19.187017 kernel: Tracing variant of Tasks RCU enabled.
May 27 18:07:19.187024 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 18:07:19.187031 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=80
May 27 18:07:19.187038 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=80.
May 27 18:07:19.187045 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=80.
May 27 18:07:19.187052 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
May 27 18:07:19.187059 kernel: GICv3: GIC: Using split EOI/Deactivate mode
May 27 18:07:19.187065 kernel: GICv3: 672 SPIs implemented
May 27 18:07:19.187072 kernel: GICv3: 0 Extended SPIs implemented
May 27 18:07:19.187079 kernel: Root IRQ handler: gic_handle_irq
May 27 18:07:19.187086 kernel: GICv3: GICv3 features: 16 PPIs
May 27 18:07:19.187094 kernel: GICv3: GICD_CTRL.DS=0, SCR_EL3.FIQ=1
May 27 18:07:19.187101 kernel: GICv3: CPU0: found redistributor 120000 region 0:0x00001001005c0000
May 27 18:07:19.187107 kernel: SRAT: PXM 0 -> ITS 0 -> Node 0
May 27 18:07:19.187114 kernel: SRAT: PXM 0 -> ITS 1 -> Node 0
May 27 18:07:19.187121 kernel: SRAT: PXM 0 -> ITS 2 -> Node 0
May 27 18:07:19.187127 kernel: SRAT: PXM 0 -> ITS 3 -> Node 0
May 27 18:07:19.187134 kernel: SRAT: PXM 0 -> ITS 4 -> Node 0
May 27 18:07:19.187141 kernel: SRAT: PXM 0 -> ITS 5 -> Node 0
May 27 18:07:19.187147 kernel: SRAT: PXM 0 -> ITS 6 -> Node 0
May 27 18:07:19.187154 kernel: SRAT: PXM 0 -> ITS 7 -> Node 0
May 27 18:07:19.187161 kernel: ITS [mem 0x100100040000-0x10010005ffff]
May 27 18:07:19.187168 kernel: ITS@0x0000100100040000: allocated 8192 Devices @80000310000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187176 kernel: ITS@0x0000100100040000: allocated 32768 Interrupt Collections @80000320000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187183 kernel: ITS [mem 0x100100060000-0x10010007ffff]
May 27 18:07:19.187190 kernel: ITS@0x0000100100060000: allocated 8192 Devices @80000340000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187197 kernel: ITS@0x0000100100060000: allocated 32768 Interrupt Collections @80000350000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187203 kernel: ITS [mem 0x100100080000-0x10010009ffff]
May 27 18:07:19.187210 kernel: ITS@0x0000100100080000: allocated 8192 Devices @80000370000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187217 kernel: ITS@0x0000100100080000: allocated 32768 Interrupt Collections @80000380000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187224 kernel: ITS [mem 0x1001000a0000-0x1001000bffff]
May 27 18:07:19.187231 kernel: ITS@0x00001001000a0000: allocated 8192 Devices @800003a0000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187238 kernel: ITS@0x00001001000a0000: allocated 32768 Interrupt Collections @800003b0000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187245 kernel: ITS [mem 0x1001000c0000-0x1001000dffff]
May 27 18:07:19.187253 kernel: ITS@0x00001001000c0000: allocated 8192 Devices @800003d0000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187260 kernel: ITS@0x00001001000c0000: allocated 32768 Interrupt Collections @800003e0000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187267 kernel: ITS [mem 0x1001000e0000-0x1001000fffff]
May 27 18:07:19.187274 kernel: ITS@0x00001001000e0000: allocated 8192 Devices @80000800000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187281 kernel: ITS@0x00001001000e0000: allocated 32768 Interrupt Collections @80000810000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187288 kernel: ITS [mem 0x100100100000-0x10010011ffff]
May 27 18:07:19.187294 kernel: ITS@0x0000100100100000: allocated 8192 Devices @80000830000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187301 kernel: ITS@0x0000100100100000: allocated 32768 Interrupt Collections @80000840000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187308 kernel: ITS [mem 0x100100120000-0x10010013ffff]
May 27 18:07:19.187315 kernel: ITS@0x0000100100120000: allocated 8192 Devices @80000860000 (indirect, esz 8, psz 64K, shr 1)
May 27 18:07:19.187322 kernel: ITS@0x0000100100120000: allocated 32768 Interrupt Collections @80000870000 (flat, esz 2, psz 64K, shr 1)
May 27 18:07:19.187330 kernel: GICv3: using LPI property table @0x0000080000880000
May 27 18:07:19.187337 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000080000890000
May 27 18:07:19.187344 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 18:07:19.187351 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187358 kernel: ACPI GTDT: found 1 memory-mapped timer block(s).
May 27 18:07:19.187364 kernel: arch_timer: cp15 and mmio timer(s) running at 25.00MHz (phys/phys).
May 27 18:07:19.187371 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
May 27 18:07:19.187378 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
May 27 18:07:19.187385 kernel: Console: colour dummy device 80x25
May 27 18:07:19.187392 kernel: printk: legacy console [tty0] enabled
May 27 18:07:19.187399 kernel: ACPI: Core revision 20240827
May 27 18:07:19.187408 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
May 27 18:07:19.187415 kernel: pid_max: default: 81920 minimum: 640
May 27 18:07:19.187422 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 18:07:19.187429 kernel: landlock: Up and running.
May 27 18:07:19.187436 kernel: SELinux: Initializing.
May 27 18:07:19.187443 kernel: Mount-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.187450 kernel: Mountpoint-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.187457 kernel: rcu: Hierarchical SRCU implementation.
May 27 18:07:19.187464 kernel: rcu: Max phase no-delay instances is 400.
May 27 18:07:19.187472 kernel: Timer migration: 3 hierarchy levels; 8 children per group; 3 crossnode level
May 27 18:07:19.187479 kernel: Remapping and enabling EFI services.
May 27 18:07:19.187486 kernel: smp: Bringing up secondary CPUs ...
May 27 18:07:19.187493 kernel: Detected PIPT I-cache on CPU1
May 27 18:07:19.187500 kernel: GICv3: CPU1: found redistributor 1a0000 region 0:0x00001001007c0000
May 27 18:07:19.187507 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000800008a0000
May 27 18:07:19.187514 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187521 kernel: CPU1: Booted secondary processor 0x00001a0000 [0x413fd0c1]
May 27 18:07:19.187528 kernel: Detected PIPT I-cache on CPU2
May 27 18:07:19.187536 kernel: GICv3: CPU2: found redistributor 140000 region 0:0x0000100100640000
May 27 18:07:19.187543 kernel: GICv3: CPU2: using allocated LPI pending table @0x00000800008b0000
May 27 18:07:19.187550 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187557 kernel: CPU2: Booted secondary processor 0x0000140000 [0x413fd0c1]
May 27 18:07:19.187564 kernel: Detected PIPT I-cache on CPU3
May 27 18:07:19.187571 kernel: GICv3: CPU3: found redistributor 1c0000 region 0:0x0000100100840000
May 27 18:07:19.187578 kernel: GICv3: CPU3: using allocated LPI pending table @0x00000800008c0000
May 27 18:07:19.187587 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187594 kernel: CPU3: Booted secondary processor 0x00001c0000 [0x413fd0c1]
May 27 18:07:19.187601 kernel: Detected PIPT I-cache on CPU4
May 27 18:07:19.187609 kernel: GICv3: CPU4: found redistributor 100000 region 0:0x0000100100540000
May 27 18:07:19.187616 kernel: GICv3: CPU4: using allocated LPI pending table @0x00000800008d0000
May 27 18:07:19.187623 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187630 kernel: CPU4: Booted secondary processor 0x0000100000 [0x413fd0c1]
May 27 18:07:19.187637 kernel: Detected PIPT I-cache on CPU5
May 27 18:07:19.187644 kernel: GICv3: CPU5: found redistributor 180000 region 0:0x0000100100740000
May 27 18:07:19.187651 kernel: GICv3: CPU5: using allocated LPI pending table @0x00000800008e0000
May 27 18:07:19.187658 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187665 kernel: CPU5: Booted secondary processor 0x0000180000 [0x413fd0c1]
May 27 18:07:19.187673 kernel: Detected PIPT I-cache on CPU6
May 27 18:07:19.187681 kernel: GICv3: CPU6: found redistributor 160000 region 0:0x00001001006c0000
May 27 18:07:19.187688 kernel: GICv3: CPU6: using allocated LPI pending table @0x00000800008f0000
May 27 18:07:19.187695 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187701 kernel: CPU6: Booted secondary processor 0x0000160000 [0x413fd0c1]
May 27 18:07:19.187709 kernel: Detected PIPT I-cache on CPU7
May 27 18:07:19.187716 kernel: GICv3: CPU7: found redistributor 1e0000 region 0:0x00001001008c0000
May 27 18:07:19.187723 kernel: GICv3: CPU7: using allocated LPI pending table @0x0000080000900000
May 27 18:07:19.187730 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187737 kernel: CPU7: Booted secondary processor 0x00001e0000 [0x413fd0c1]
May 27 18:07:19.187745 kernel: Detected PIPT I-cache on CPU8
May 27 18:07:19.187752 kernel: GICv3: CPU8: found redistributor a0000 region 0:0x00001001003c0000
May 27 18:07:19.187759 kernel: GICv3: CPU8: using allocated LPI pending table @0x0000080000910000
May 27 18:07:19.187766 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187772 kernel: CPU8: Booted secondary processor 0x00000a0000 [0x413fd0c1]
May 27 18:07:19.187779 kernel: Detected PIPT I-cache on CPU9
May 27 18:07:19.187786 kernel: GICv3: CPU9: found redistributor 220000 region 0:0x00001001009c0000
May 27 18:07:19.187793 kernel: GICv3: CPU9: using allocated LPI pending table @0x0000080000920000
May 27 18:07:19.187800 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187808 kernel: CPU9: Booted secondary processor 0x0000220000 [0x413fd0c1]
May 27 18:07:19.187815 kernel: Detected PIPT I-cache on CPU10
May 27 18:07:19.187822 kernel: GICv3: CPU10: found redistributor c0000 region 0:0x0000100100440000
May 27 18:07:19.187829 kernel: GICv3: CPU10: using allocated LPI pending table @0x0000080000930000
May 27 18:07:19.187836 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187842 kernel: CPU10: Booted secondary processor 0x00000c0000 [0x413fd0c1]
May 27 18:07:19.187849 kernel: Detected PIPT I-cache on CPU11
May 27 18:07:19.187856 kernel: GICv3: CPU11: found redistributor 240000 region 0:0x0000100100a40000
May 27 18:07:19.187863 kernel: GICv3: CPU11: using allocated LPI pending table @0x0000080000940000
May 27 18:07:19.187870 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187878 kernel: CPU11: Booted secondary processor 0x0000240000 [0x413fd0c1]
May 27 18:07:19.187885 kernel: Detected PIPT I-cache on CPU12
May 27 18:07:19.187892 kernel: GICv3: CPU12: found redistributor 80000 region 0:0x0000100100340000
May 27 18:07:19.187899 kernel: GICv3: CPU12: using allocated LPI pending table @0x0000080000950000
May 27 18:07:19.187906 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187913 kernel: CPU12: Booted secondary processor 0x0000080000 [0x413fd0c1]
May 27 18:07:19.187920 kernel: Detected PIPT I-cache on CPU13
May 27 18:07:19.187927 kernel: GICv3: CPU13: found redistributor 200000 region 0:0x0000100100940000
May 27 18:07:19.187934 kernel: GICv3: CPU13: using allocated LPI pending table @0x0000080000960000
May 27 18:07:19.187942 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187949 kernel: CPU13: Booted secondary processor 0x0000200000 [0x413fd0c1]
May 27 18:07:19.187956 kernel: Detected PIPT I-cache on CPU14
May 27 18:07:19.187963 kernel: GICv3: CPU14: found redistributor e0000 region 0:0x00001001004c0000
May 27 18:07:19.187970 kernel: GICv3: CPU14: using allocated LPI pending table @0x0000080000970000
May 27 18:07:19.187977 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.187984 kernel: CPU14: Booted secondary processor 0x00000e0000 [0x413fd0c1]
May 27 18:07:19.187991 kernel: Detected PIPT I-cache on CPU15
May 27 18:07:19.187998 kernel: GICv3: CPU15: found redistributor 260000 region 0:0x0000100100ac0000
May 27 18:07:19.188006 kernel: GICv3: CPU15: using allocated LPI pending table @0x0000080000980000
May 27 18:07:19.188013 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188020 kernel: CPU15: Booted secondary processor 0x0000260000 [0x413fd0c1]
May 27 18:07:19.188027 kernel: Detected PIPT I-cache on CPU16
May 27 18:07:19.188034 kernel: GICv3: CPU16: found redistributor 20000 region 0:0x00001001001c0000
May 27 18:07:19.188041 kernel: GICv3: CPU16: using allocated LPI pending table @0x0000080000990000
May 27 18:07:19.188048 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188055 kernel: CPU16: Booted secondary processor 0x0000020000 [0x413fd0c1]
May 27 18:07:19.188061 kernel: Detected PIPT I-cache on CPU17
May 27 18:07:19.188070 kernel: GICv3: CPU17: found redistributor 40000 region 0:0x0000100100240000
May 27 18:07:19.188077 kernel: GICv3: CPU17: using allocated LPI pending table @0x00000800009a0000
May 27 18:07:19.188084 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188090 kernel: CPU17: Booted secondary processor 0x0000040000 [0x413fd0c1]
May 27 18:07:19.188097 kernel: Detected PIPT I-cache on CPU18
May 27 18:07:19.188104 kernel: GICv3: CPU18: found redistributor 0 region 0:0x0000100100140000
May 27 18:07:19.188111 kernel: GICv3: CPU18: using allocated LPI pending table @0x00000800009b0000
May 27 18:07:19.188118 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188125 kernel: CPU18: Booted secondary processor 0x0000000000 [0x413fd0c1]
May 27 18:07:19.188141 kernel: Detected PIPT I-cache on CPU19
May 27 18:07:19.188151 kernel: GICv3: CPU19: found redistributor 60000 region 0:0x00001001002c0000
May 27 18:07:19.188158 kernel: GICv3: CPU19: using allocated LPI pending table @0x00000800009c0000
May 27 18:07:19.188166 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188173 kernel: CPU19: Booted secondary processor 0x0000060000 [0x413fd0c1]
May 27 18:07:19.188180 kernel: Detected PIPT I-cache on CPU20
May 27 18:07:19.188187 kernel: GICv3: CPU20: found redistributor 130000 region 0:0x0000100100600000
May 27 18:07:19.188195 kernel: GICv3: CPU20: using allocated LPI pending table @0x00000800009d0000
May 27 18:07:19.188202 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188209 kernel: CPU20: Booted secondary processor 0x0000130000 [0x413fd0c1]
May 27 18:07:19.188217 kernel: Detected PIPT I-cache on CPU21
May 27 18:07:19.188225 kernel: GICv3: CPU21: found redistributor 1b0000 region 0:0x0000100100800000
May 27 18:07:19.188232 kernel: GICv3: CPU21: using allocated LPI pending table @0x00000800009e0000
May 27 18:07:19.188239 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188246 kernel: CPU21: Booted secondary processor 0x00001b0000 [0x413fd0c1]
May 27 18:07:19.188253 kernel: Detected PIPT I-cache on CPU22
May 27 18:07:19.188261 kernel: GICv3: CPU22: found redistributor 150000 region 0:0x0000100100680000
May 27 18:07:19.188269 kernel: GICv3: CPU22: using allocated LPI pending table @0x00000800009f0000
May 27 18:07:19.188277 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188284 kernel: CPU22: Booted secondary processor 0x0000150000 [0x413fd0c1]
May 27 18:07:19.188291 kernel: Detected PIPT I-cache on CPU23
May 27 18:07:19.188298 kernel: GICv3: CPU23: found redistributor 1d0000 region 0:0x0000100100880000
May 27 18:07:19.188306 kernel: GICv3: CPU23: using allocated LPI pending table @0x0000080000a00000
May 27 18:07:19.188313 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188320 kernel: CPU23: Booted secondary processor 0x00001d0000 [0x413fd0c1]
May 27 18:07:19.188327 kernel: Detected PIPT I-cache on CPU24
May 27 18:07:19.188336 kernel: GICv3: CPU24: found redistributor 110000 region 0:0x0000100100580000
May 27 18:07:19.188343 kernel: GICv3: CPU24: using allocated LPI pending table @0x0000080000a10000
May 27 18:07:19.188351 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188358 kernel: CPU24: Booted secondary processor 0x0000110000 [0x413fd0c1]
May 27 18:07:19.188365 kernel: Detected PIPT I-cache on CPU25
May 27 18:07:19.188372 kernel: GICv3: CPU25: found redistributor 190000 region 0:0x0000100100780000
May 27 18:07:19.188380 kernel: GICv3: CPU25: using allocated LPI pending table @0x0000080000a20000
May 27 18:07:19.188387 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188394 kernel: CPU25: Booted secondary processor 0x0000190000 [0x413fd0c1]
May 27 18:07:19.188403 kernel: Detected PIPT I-cache on CPU26
May 27 18:07:19.188410 kernel: GICv3: CPU26: found redistributor 170000 region 0:0x0000100100700000
May 27 18:07:19.188418 kernel: GICv3: CPU26: using allocated LPI pending table @0x0000080000a30000
May 27 18:07:19.188425 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188432 kernel: CPU26: Booted secondary processor 0x0000170000 [0x413fd0c1]
May 27 18:07:19.188440 kernel: Detected PIPT I-cache on CPU27
May 27 18:07:19.188447 kernel: GICv3: CPU27: found redistributor 1f0000 region 0:0x0000100100900000
May 27 18:07:19.188454 kernel: GICv3: CPU27: using allocated LPI pending table @0x0000080000a40000
May 27 18:07:19.188461 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188469 kernel: CPU27: Booted secondary processor 0x00001f0000 [0x413fd0c1]
May 27 18:07:19.188477 kernel: Detected PIPT I-cache on CPU28
May 27 18:07:19.188484 kernel: GICv3: CPU28: found redistributor b0000 region 0:0x0000100100400000
May 27 18:07:19.188492 kernel: GICv3: CPU28: using allocated LPI pending table @0x0000080000a50000
May 27 18:07:19.188499 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188506 kernel: CPU28: Booted secondary processor 0x00000b0000 [0x413fd0c1]
May 27 18:07:19.188513 kernel: Detected PIPT I-cache on CPU29
May 27 18:07:19.188521 kernel: GICv3: CPU29: found redistributor 230000 region 0:0x0000100100a00000
May 27 18:07:19.188528 kernel: GICv3: CPU29: using allocated LPI pending table @0x0000080000a60000
May 27 18:07:19.188535 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188544 kernel: CPU29: Booted secondary processor 0x0000230000 [0x413fd0c1]
May 27 18:07:19.188551 kernel: Detected PIPT I-cache on CPU30
May 27 18:07:19.188559 kernel: GICv3: CPU30: found redistributor d0000 region 0:0x0000100100480000
May 27 18:07:19.188566 kernel: GICv3: CPU30: using allocated LPI pending table @0x0000080000a70000
May 27 18:07:19.188573 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188583 kernel: CPU30: Booted secondary processor 0x00000d0000 [0x413fd0c1]
May 27 18:07:19.188590 kernel: Detected PIPT I-cache on CPU31
May 27 18:07:19.188598 kernel: GICv3: CPU31: found redistributor 250000 region 0:0x0000100100a80000
May 27 18:07:19.188605 kernel: GICv3: CPU31: using allocated LPI pending table @0x0000080000a80000
May 27 18:07:19.188614 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 27 18:07:19.188621 kernel: CPU31: Booted secondary processor 0x0000250000 [0x413fd0c1]
May 27 18:07:19.188629 kernel: Detected PIPT I-cache on CPU32
May 27 18:07:19.188636 kernel: GICv3: CPU32: found redistributor 90000 region 0:0x0000100100380000
May 27 18:07:19.188644 kernel: GICv3: CPU32: using allocated LPI
pending table @0x0000080000a90000 May 27 18:07:19.188651 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188659 kernel: CPU32: Booted secondary processor 0x0000090000 [0x413fd0c1] May 27 18:07:19.188666 kernel: Detected PIPT I-cache on CPU33 May 27 18:07:19.188673 kernel: GICv3: CPU33: found redistributor 210000 region 0:0x0000100100980000 May 27 18:07:19.188681 kernel: GICv3: CPU33: using allocated LPI pending table @0x0000080000aa0000 May 27 18:07:19.188689 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188696 kernel: CPU33: Booted secondary processor 0x0000210000 [0x413fd0c1] May 27 18:07:19.188704 kernel: Detected PIPT I-cache on CPU34 May 27 18:07:19.188711 kernel: GICv3: CPU34: found redistributor f0000 region 0:0x0000100100500000 May 27 18:07:19.188718 kernel: GICv3: CPU34: using allocated LPI pending table @0x0000080000ab0000 May 27 18:07:19.188727 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188734 kernel: CPU34: Booted secondary processor 0x00000f0000 [0x413fd0c1] May 27 18:07:19.188743 kernel: Detected PIPT I-cache on CPU35 May 27 18:07:19.188750 kernel: GICv3: CPU35: found redistributor 270000 region 0:0x0000100100b00000 May 27 18:07:19.188759 kernel: GICv3: CPU35: using allocated LPI pending table @0x0000080000ac0000 May 27 18:07:19.188766 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188773 kernel: CPU35: Booted secondary processor 0x0000270000 [0x413fd0c1] May 27 18:07:19.188781 kernel: Detected PIPT I-cache on CPU36 May 27 18:07:19.188788 kernel: GICv3: CPU36: found redistributor 30000 region 0:0x0000100100200000 May 27 18:07:19.188795 kernel: GICv3: CPU36: using allocated LPI pending table @0x0000080000ad0000 May 27 18:07:19.188803 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188810 kernel: CPU36: Booted secondary processor 0x0000030000 [0x413fd0c1] May 
27 18:07:19.188817 kernel: Detected PIPT I-cache on CPU37 May 27 18:07:19.188824 kernel: GICv3: CPU37: found redistributor 50000 region 0:0x0000100100280000 May 27 18:07:19.188833 kernel: GICv3: CPU37: using allocated LPI pending table @0x0000080000ae0000 May 27 18:07:19.188840 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188847 kernel: CPU37: Booted secondary processor 0x0000050000 [0x413fd0c1] May 27 18:07:19.188855 kernel: Detected PIPT I-cache on CPU38 May 27 18:07:19.188862 kernel: GICv3: CPU38: found redistributor 10000 region 0:0x0000100100180000 May 27 18:07:19.188869 kernel: GICv3: CPU38: using allocated LPI pending table @0x0000080000af0000 May 27 18:07:19.188877 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188884 kernel: CPU38: Booted secondary processor 0x0000010000 [0x413fd0c1] May 27 18:07:19.188891 kernel: Detected PIPT I-cache on CPU39 May 27 18:07:19.188900 kernel: GICv3: CPU39: found redistributor 70000 region 0:0x0000100100300000 May 27 18:07:19.188907 kernel: GICv3: CPU39: using allocated LPI pending table @0x0000080000b00000 May 27 18:07:19.188914 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188922 kernel: CPU39: Booted secondary processor 0x0000070000 [0x413fd0c1] May 27 18:07:19.188929 kernel: Detected PIPT I-cache on CPU40 May 27 18:07:19.188936 kernel: GICv3: CPU40: found redistributor 120100 region 0:0x00001001005e0000 May 27 18:07:19.188944 kernel: GICv3: CPU40: using allocated LPI pending table @0x0000080000b10000 May 27 18:07:19.188951 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188960 kernel: CPU40: Booted secondary processor 0x0000120100 [0x413fd0c1] May 27 18:07:19.188967 kernel: Detected PIPT I-cache on CPU41 May 27 18:07:19.188975 kernel: GICv3: CPU41: found redistributor 1a0100 region 0:0x00001001007e0000 May 27 18:07:19.188982 kernel: GICv3: CPU41: using allocated LPI 
pending table @0x0000080000b20000 May 27 18:07:19.188989 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.188997 kernel: CPU41: Booted secondary processor 0x00001a0100 [0x413fd0c1] May 27 18:07:19.189004 kernel: Detected PIPT I-cache on CPU42 May 27 18:07:19.189011 kernel: GICv3: CPU42: found redistributor 140100 region 0:0x0000100100660000 May 27 18:07:19.189019 kernel: GICv3: CPU42: using allocated LPI pending table @0x0000080000b30000 May 27 18:07:19.189027 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189035 kernel: CPU42: Booted secondary processor 0x0000140100 [0x413fd0c1] May 27 18:07:19.189042 kernel: Detected PIPT I-cache on CPU43 May 27 18:07:19.189049 kernel: GICv3: CPU43: found redistributor 1c0100 region 0:0x0000100100860000 May 27 18:07:19.189057 kernel: GICv3: CPU43: using allocated LPI pending table @0x0000080000b40000 May 27 18:07:19.189064 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189071 kernel: CPU43: Booted secondary processor 0x00001c0100 [0x413fd0c1] May 27 18:07:19.189078 kernel: Detected PIPT I-cache on CPU44 May 27 18:07:19.189086 kernel: GICv3: CPU44: found redistributor 100100 region 0:0x0000100100560000 May 27 18:07:19.189093 kernel: GICv3: CPU44: using allocated LPI pending table @0x0000080000b50000 May 27 18:07:19.189102 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189109 kernel: CPU44: Booted secondary processor 0x0000100100 [0x413fd0c1] May 27 18:07:19.189117 kernel: Detected PIPT I-cache on CPU45 May 27 18:07:19.189124 kernel: GICv3: CPU45: found redistributor 180100 region 0:0x0000100100760000 May 27 18:07:19.189131 kernel: GICv3: CPU45: using allocated LPI pending table @0x0000080000b60000 May 27 18:07:19.189139 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189146 kernel: CPU45: Booted secondary processor 0x0000180100 [0x413fd0c1] 
May 27 18:07:19.189153 kernel: Detected PIPT I-cache on CPU46 May 27 18:07:19.189161 kernel: GICv3: CPU46: found redistributor 160100 region 0:0x00001001006e0000 May 27 18:07:19.189169 kernel: GICv3: CPU46: using allocated LPI pending table @0x0000080000b70000 May 27 18:07:19.189177 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189184 kernel: CPU46: Booted secondary processor 0x0000160100 [0x413fd0c1] May 27 18:07:19.189191 kernel: Detected PIPT I-cache on CPU47 May 27 18:07:19.189199 kernel: GICv3: CPU47: found redistributor 1e0100 region 0:0x00001001008e0000 May 27 18:07:19.189206 kernel: GICv3: CPU47: using allocated LPI pending table @0x0000080000b80000 May 27 18:07:19.189213 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189221 kernel: CPU47: Booted secondary processor 0x00001e0100 [0x413fd0c1] May 27 18:07:19.189228 kernel: Detected PIPT I-cache on CPU48 May 27 18:07:19.189235 kernel: GICv3: CPU48: found redistributor a0100 region 0:0x00001001003e0000 May 27 18:07:19.189245 kernel: GICv3: CPU48: using allocated LPI pending table @0x0000080000b90000 May 27 18:07:19.189252 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189260 kernel: CPU48: Booted secondary processor 0x00000a0100 [0x413fd0c1] May 27 18:07:19.189267 kernel: Detected PIPT I-cache on CPU49 May 27 18:07:19.189274 kernel: GICv3: CPU49: found redistributor 220100 region 0:0x00001001009e0000 May 27 18:07:19.189282 kernel: GICv3: CPU49: using allocated LPI pending table @0x0000080000ba0000 May 27 18:07:19.189289 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189296 kernel: CPU49: Booted secondary processor 0x0000220100 [0x413fd0c1] May 27 18:07:19.189303 kernel: Detected PIPT I-cache on CPU50 May 27 18:07:19.189312 kernel: GICv3: CPU50: found redistributor c0100 region 0:0x0000100100460000 May 27 18:07:19.189320 kernel: GICv3: CPU50: using 
allocated LPI pending table @0x0000080000bb0000 May 27 18:07:19.189327 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189334 kernel: CPU50: Booted secondary processor 0x00000c0100 [0x413fd0c1] May 27 18:07:19.189341 kernel: Detected PIPT I-cache on CPU51 May 27 18:07:19.189349 kernel: GICv3: CPU51: found redistributor 240100 region 0:0x0000100100a60000 May 27 18:07:19.189356 kernel: GICv3: CPU51: using allocated LPI pending table @0x0000080000bc0000 May 27 18:07:19.189363 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189371 kernel: CPU51: Booted secondary processor 0x0000240100 [0x413fd0c1] May 27 18:07:19.189378 kernel: Detected PIPT I-cache on CPU52 May 27 18:07:19.189386 kernel: GICv3: CPU52: found redistributor 80100 region 0:0x0000100100360000 May 27 18:07:19.189394 kernel: GICv3: CPU52: using allocated LPI pending table @0x0000080000bd0000 May 27 18:07:19.189401 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189408 kernel: CPU52: Booted secondary processor 0x0000080100 [0x413fd0c1] May 27 18:07:19.189416 kernel: Detected PIPT I-cache on CPU53 May 27 18:07:19.189423 kernel: GICv3: CPU53: found redistributor 200100 region 0:0x0000100100960000 May 27 18:07:19.189431 kernel: GICv3: CPU53: using allocated LPI pending table @0x0000080000be0000 May 27 18:07:19.189438 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189446 kernel: CPU53: Booted secondary processor 0x0000200100 [0x413fd0c1] May 27 18:07:19.189454 kernel: Detected PIPT I-cache on CPU54 May 27 18:07:19.189462 kernel: GICv3: CPU54: found redistributor e0100 region 0:0x00001001004e0000 May 27 18:07:19.189469 kernel: GICv3: CPU54: using allocated LPI pending table @0x0000080000bf0000 May 27 18:07:19.189476 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189483 kernel: CPU54: Booted secondary processor 0x00000e0100 
[0x413fd0c1] May 27 18:07:19.189491 kernel: Detected PIPT I-cache on CPU55 May 27 18:07:19.189498 kernel: GICv3: CPU55: found redistributor 260100 region 0:0x0000100100ae0000 May 27 18:07:19.189505 kernel: GICv3: CPU55: using allocated LPI pending table @0x0000080000c00000 May 27 18:07:19.189513 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189520 kernel: CPU55: Booted secondary processor 0x0000260100 [0x413fd0c1] May 27 18:07:19.189529 kernel: Detected PIPT I-cache on CPU56 May 27 18:07:19.189537 kernel: GICv3: CPU56: found redistributor 20100 region 0:0x00001001001e0000 May 27 18:07:19.189545 kernel: GICv3: CPU56: using allocated LPI pending table @0x0000080000c10000 May 27 18:07:19.189552 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189559 kernel: CPU56: Booted secondary processor 0x0000020100 [0x413fd0c1] May 27 18:07:19.189567 kernel: Detected PIPT I-cache on CPU57 May 27 18:07:19.189574 kernel: GICv3: CPU57: found redistributor 40100 region 0:0x0000100100260000 May 27 18:07:19.189584 kernel: GICv3: CPU57: using allocated LPI pending table @0x0000080000c20000 May 27 18:07:19.189591 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189600 kernel: CPU57: Booted secondary processor 0x0000040100 [0x413fd0c1] May 27 18:07:19.189607 kernel: Detected PIPT I-cache on CPU58 May 27 18:07:19.189615 kernel: GICv3: CPU58: found redistributor 100 region 0:0x0000100100160000 May 27 18:07:19.189622 kernel: GICv3: CPU58: using allocated LPI pending table @0x0000080000c30000 May 27 18:07:19.189629 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189637 kernel: CPU58: Booted secondary processor 0x0000000100 [0x413fd0c1] May 27 18:07:19.189644 kernel: Detected PIPT I-cache on CPU59 May 27 18:07:19.189651 kernel: GICv3: CPU59: found redistributor 60100 region 0:0x00001001002e0000 May 27 18:07:19.189659 kernel: GICv3: CPU59: using 
allocated LPI pending table @0x0000080000c40000 May 27 18:07:19.189667 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189675 kernel: CPU59: Booted secondary processor 0x0000060100 [0x413fd0c1] May 27 18:07:19.189682 kernel: Detected PIPT I-cache on CPU60 May 27 18:07:19.189689 kernel: GICv3: CPU60: found redistributor 130100 region 0:0x0000100100620000 May 27 18:07:19.189697 kernel: GICv3: CPU60: using allocated LPI pending table @0x0000080000c50000 May 27 18:07:19.189704 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189711 kernel: CPU60: Booted secondary processor 0x0000130100 [0x413fd0c1] May 27 18:07:19.189718 kernel: Detected PIPT I-cache on CPU61 May 27 18:07:19.189726 kernel: GICv3: CPU61: found redistributor 1b0100 region 0:0x0000100100820000 May 27 18:07:19.189733 kernel: GICv3: CPU61: using allocated LPI pending table @0x0000080000c60000 May 27 18:07:19.189742 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189749 kernel: CPU61: Booted secondary processor 0x00001b0100 [0x413fd0c1] May 27 18:07:19.189757 kernel: Detected PIPT I-cache on CPU62 May 27 18:07:19.189764 kernel: GICv3: CPU62: found redistributor 150100 region 0:0x00001001006a0000 May 27 18:07:19.189771 kernel: GICv3: CPU62: using allocated LPI pending table @0x0000080000c70000 May 27 18:07:19.189779 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189786 kernel: CPU62: Booted secondary processor 0x0000150100 [0x413fd0c1] May 27 18:07:19.189793 kernel: Detected PIPT I-cache on CPU63 May 27 18:07:19.189801 kernel: GICv3: CPU63: found redistributor 1d0100 region 0:0x00001001008a0000 May 27 18:07:19.189809 kernel: GICv3: CPU63: using allocated LPI pending table @0x0000080000c80000 May 27 18:07:19.189816 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189824 kernel: CPU63: Booted secondary processor 0x00001d0100 
[0x413fd0c1] May 27 18:07:19.189831 kernel: Detected PIPT I-cache on CPU64 May 27 18:07:19.189838 kernel: GICv3: CPU64: found redistributor 110100 region 0:0x00001001005a0000 May 27 18:07:19.189846 kernel: GICv3: CPU64: using allocated LPI pending table @0x0000080000c90000 May 27 18:07:19.189853 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189860 kernel: CPU64: Booted secondary processor 0x0000110100 [0x413fd0c1] May 27 18:07:19.189868 kernel: Detected PIPT I-cache on CPU65 May 27 18:07:19.189875 kernel: GICv3: CPU65: found redistributor 190100 region 0:0x00001001007a0000 May 27 18:07:19.189883 kernel: GICv3: CPU65: using allocated LPI pending table @0x0000080000ca0000 May 27 18:07:19.189891 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189898 kernel: CPU65: Booted secondary processor 0x0000190100 [0x413fd0c1] May 27 18:07:19.189905 kernel: Detected PIPT I-cache on CPU66 May 27 18:07:19.189913 kernel: GICv3: CPU66: found redistributor 170100 region 0:0x0000100100720000 May 27 18:07:19.189921 kernel: GICv3: CPU66: using allocated LPI pending table @0x0000080000cb0000 May 27 18:07:19.189928 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189935 kernel: CPU66: Booted secondary processor 0x0000170100 [0x413fd0c1] May 27 18:07:19.189942 kernel: Detected PIPT I-cache on CPU67 May 27 18:07:19.189951 kernel: GICv3: CPU67: found redistributor 1f0100 region 0:0x0000100100920000 May 27 18:07:19.189959 kernel: GICv3: CPU67: using allocated LPI pending table @0x0000080000cc0000 May 27 18:07:19.189966 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.189973 kernel: CPU67: Booted secondary processor 0x00001f0100 [0x413fd0c1] May 27 18:07:19.189980 kernel: Detected PIPT I-cache on CPU68 May 27 18:07:19.189988 kernel: GICv3: CPU68: found redistributor b0100 region 0:0x0000100100420000 May 27 18:07:19.189995 kernel: GICv3: CPU68: 
using allocated LPI pending table @0x0000080000cd0000 May 27 18:07:19.190002 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190009 kernel: CPU68: Booted secondary processor 0x00000b0100 [0x413fd0c1] May 27 18:07:19.190017 kernel: Detected PIPT I-cache on CPU69 May 27 18:07:19.190025 kernel: GICv3: CPU69: found redistributor 230100 region 0:0x0000100100a20000 May 27 18:07:19.190032 kernel: GICv3: CPU69: using allocated LPI pending table @0x0000080000ce0000 May 27 18:07:19.190040 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190047 kernel: CPU69: Booted secondary processor 0x0000230100 [0x413fd0c1] May 27 18:07:19.190054 kernel: Detected PIPT I-cache on CPU70 May 27 18:07:19.190061 kernel: GICv3: CPU70: found redistributor d0100 region 0:0x00001001004a0000 May 27 18:07:19.190069 kernel: GICv3: CPU70: using allocated LPI pending table @0x0000080000cf0000 May 27 18:07:19.190076 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190083 kernel: CPU70: Booted secondary processor 0x00000d0100 [0x413fd0c1] May 27 18:07:19.190092 kernel: Detected PIPT I-cache on CPU71 May 27 18:07:19.190099 kernel: GICv3: CPU71: found redistributor 250100 region 0:0x0000100100aa0000 May 27 18:07:19.190107 kernel: GICv3: CPU71: using allocated LPI pending table @0x0000080000d00000 May 27 18:07:19.190114 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190121 kernel: CPU71: Booted secondary processor 0x0000250100 [0x413fd0c1] May 27 18:07:19.190129 kernel: Detected PIPT I-cache on CPU72 May 27 18:07:19.190136 kernel: GICv3: CPU72: found redistributor 90100 region 0:0x00001001003a0000 May 27 18:07:19.190143 kernel: GICv3: CPU72: using allocated LPI pending table @0x0000080000d10000 May 27 18:07:19.190151 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190158 kernel: CPU72: Booted secondary processor 
0x0000090100 [0x413fd0c1] May 27 18:07:19.190166 kernel: Detected PIPT I-cache on CPU73 May 27 18:07:19.190174 kernel: GICv3: CPU73: found redistributor 210100 region 0:0x00001001009a0000 May 27 18:07:19.190181 kernel: GICv3: CPU73: using allocated LPI pending table @0x0000080000d20000 May 27 18:07:19.190189 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190196 kernel: CPU73: Booted secondary processor 0x0000210100 [0x413fd0c1] May 27 18:07:19.190203 kernel: Detected PIPT I-cache on CPU74 May 27 18:07:19.190210 kernel: GICv3: CPU74: found redistributor f0100 region 0:0x0000100100520000 May 27 18:07:19.190218 kernel: GICv3: CPU74: using allocated LPI pending table @0x0000080000d30000 May 27 18:07:19.190225 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190234 kernel: CPU74: Booted secondary processor 0x00000f0100 [0x413fd0c1] May 27 18:07:19.190241 kernel: Detected PIPT I-cache on CPU75 May 27 18:07:19.190248 kernel: GICv3: CPU75: found redistributor 270100 region 0:0x0000100100b20000 May 27 18:07:19.190255 kernel: GICv3: CPU75: using allocated LPI pending table @0x0000080000d40000 May 27 18:07:19.190263 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190270 kernel: CPU75: Booted secondary processor 0x0000270100 [0x413fd0c1] May 27 18:07:19.190277 kernel: Detected PIPT I-cache on CPU76 May 27 18:07:19.190284 kernel: GICv3: CPU76: found redistributor 30100 region 0:0x0000100100220000 May 27 18:07:19.190292 kernel: GICv3: CPU76: using allocated LPI pending table @0x0000080000d50000 May 27 18:07:19.190300 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190307 kernel: CPU76: Booted secondary processor 0x0000030100 [0x413fd0c1] May 27 18:07:19.190315 kernel: Detected PIPT I-cache on CPU77 May 27 18:07:19.190322 kernel: GICv3: CPU77: found redistributor 50100 region 0:0x00001001002a0000 May 27 18:07:19.190329 kernel: 
GICv3: CPU77: using allocated LPI pending table @0x0000080000d60000 May 27 18:07:19.190336 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190344 kernel: CPU77: Booted secondary processor 0x0000050100 [0x413fd0c1] May 27 18:07:19.190351 kernel: Detected PIPT I-cache on CPU78 May 27 18:07:19.190358 kernel: GICv3: CPU78: found redistributor 10100 region 0:0x00001001001a0000 May 27 18:07:19.190366 kernel: GICv3: CPU78: using allocated LPI pending table @0x0000080000d70000 May 27 18:07:19.190374 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190381 kernel: CPU78: Booted secondary processor 0x0000010100 [0x413fd0c1] May 27 18:07:19.190389 kernel: Detected PIPT I-cache on CPU79 May 27 18:07:19.190396 kernel: GICv3: CPU79: found redistributor 70100 region 0:0x0000100100320000 May 27 18:07:19.190403 kernel: GICv3: CPU79: using allocated LPI pending table @0x0000080000d80000 May 27 18:07:19.190411 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 27 18:07:19.190418 kernel: CPU79: Booted secondary processor 0x0000070100 [0x413fd0c1] May 27 18:07:19.190425 kernel: smp: Brought up 1 node, 80 CPUs May 27 18:07:19.190432 kernel: SMP: Total of 80 processors activated. 
May 27 18:07:19.190441 kernel: CPU: All CPU(s) started at EL2 May 27 18:07:19.190448 kernel: CPU features: detected: 32-bit EL0 Support May 27 18:07:19.190455 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence May 27 18:07:19.190463 kernel: CPU features: detected: Common not Private translations May 27 18:07:19.190470 kernel: CPU features: detected: CRC32 instructions May 27 18:07:19.190477 kernel: CPU features: detected: Enhanced Virtualization Traps May 27 18:07:19.190485 kernel: CPU features: detected: RCpc load-acquire (LDAPR) May 27 18:07:19.190492 kernel: CPU features: detected: LSE atomic instructions May 27 18:07:19.190499 kernel: CPU features: detected: Privileged Access Never May 27 18:07:19.190507 kernel: CPU features: detected: RAS Extension Support May 27 18:07:19.190515 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) May 27 18:07:19.190523 kernel: alternatives: applying system-wide alternatives May 27 18:07:19.190530 kernel: CPU features: detected: Hardware dirty bit management on CPU0-79 May 27 18:07:19.190538 kernel: Memory: 262860004K/268174336K available (11072K kernel code, 2276K rwdata, 8936K rodata, 39424K init, 1034K bss, 5254404K reserved, 0K cma-reserved) May 27 18:07:19.190545 kernel: devtmpfs: initialized May 27 18:07:19.190553 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 18:07:19.190560 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) May 27 18:07:19.190567 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL May 27 18:07:19.190576 kernel: 0 pages in range for non-PLT usage May 27 18:07:19.190586 kernel: 508544 pages in range for PLT usage May 27 18:07:19.190593 kernel: pinctrl core: initialized pinctrl subsystem May 27 18:07:19.190600 kernel: SMBIOS 3.4.0 present. 
May 27 18:07:19.190608 kernel: DMI: GIGABYTE R272-P30-JG/MP32-AR0-JG, BIOS F17a (SCP: 1.07.20210713) 07/22/2021 May 27 18:07:19.190615 kernel: DMI: Memory slots populated: 8/16 May 27 18:07:19.190623 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 18:07:19.190630 kernel: DMA: preallocated 4096 KiB GFP_KERNEL pool for atomic allocations May 27 18:07:19.190638 kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 27 18:07:19.190647 kernel: DMA: preallocated 4096 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 27 18:07:19.190654 kernel: audit: initializing netlink subsys (disabled) May 27 18:07:19.190662 kernel: audit: type=2000 audit(0.167:1): state=initialized audit_enabled=0 res=1 May 27 18:07:19.190669 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 18:07:19.190676 kernel: cpuidle: using governor menu May 27 18:07:19.190683 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. May 27 18:07:19.190691 kernel: ASID allocator initialised with 32768 entries May 27 18:07:19.190698 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 18:07:19.190705 kernel: Serial: AMBA PL011 UART driver May 27 18:07:19.190714 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 18:07:19.190722 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 27 18:07:19.190729 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 27 18:07:19.190737 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 27 18:07:19.190744 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 18:07:19.190752 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 27 18:07:19.190759 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 27 18:07:19.190766 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 27 18:07:19.190774 kernel: ACPI: Added _OSI(Module Device) 
May 27 18:07:19.190782 kernel: ACPI: Added _OSI(Processor Device)
May 27 18:07:19.190790 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 18:07:19.190797 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 18:07:19.190804 kernel: ACPI: 2 ACPI AML tables successfully acquired and loaded
May 27 18:07:19.190811 kernel: ACPI: Interpreter enabled
May 27 18:07:19.190819 kernel: ACPI: Using GIC for interrupt routing
May 27 18:07:19.190826 kernel: ACPI: MCFG table detected, 8 entries
May 27 18:07:19.190833 kernel: ACPI: IORT: SMMU-v3[33ffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190841 kernel: ACPI: IORT: SMMU-v3[37ffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190849 kernel: ACPI: IORT: SMMU-v3[3bffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190857 kernel: ACPI: IORT: SMMU-v3[3fffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190864 kernel: ACPI: IORT: SMMU-v3[23ffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190871 kernel: ACPI: IORT: SMMU-v3[27ffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190879 kernel: ACPI: IORT: SMMU-v3[2bffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190886 kernel: ACPI: IORT: SMMU-v3[2fffe0000000] Mapped to Proximity domain 0
May 27 18:07:19.190893 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x100002600000 (irq = 19, base_baud = 0) is a SBSA
May 27 18:07:19.190901 kernel: printk: legacy console [ttyAMA0] enabled
May 27 18:07:19.190908 kernel: ARMH0011:01: ttyAMA1 at MMIO 0x100002620000 (irq = 20, base_baud = 0) is a SBSA
May 27 18:07:19.190917 kernel: ACPI: PCI Root Bridge [PCI1] (domain 000d [bus 00-ff])
May 27 18:07:19.191046 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:07:19.191113 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug PME LTR]
May 27 18:07:19.191174 kernel: acpi PNP0A08:00: _OSC: OS now controls [AER PCIeCapability]
May 27 18:07:19.191234 kernel: acpi PNP0A08:00: MCFG quirk: ECAM at [mem 0x37fff0000000-0x37ffffffffff] for [bus 00-ff] with pci_32b_read_ops
May 27 18:07:19.191292 kernel: acpi PNP0A08:00: ECAM area [mem 0x37fff0000000-0x37ffffffffff] reserved by PNP0C02:00
May 27 18:07:19.191350 kernel: acpi PNP0A08:00: ECAM at [mem 0x37fff0000000-0x37ffffffffff] for [bus 00-ff]
May 27 18:07:19.191362 kernel: PCI host bridge to bus 000d:00
May 27 18:07:19.191429 kernel: pci_bus 000d:00: root bus resource [mem 0x50000000-0x5fffffff window]
May 27 18:07:19.191485 kernel: pci_bus 000d:00: root bus resource [mem 0x340000000000-0x37ffdfffffff window]
May 27 18:07:19.191539 kernel: pci_bus 000d:00: root bus resource [bus 00-ff]
May 27 18:07:19.191624 kernel: pci 000d:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint
May 27 18:07:19.191697 kernel: pci 000d:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.191763 kernel: pci 000d:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.191827 kernel: pci 000d:00:01.0: enabling Extended Tags
May 27 18:07:19.191891 kernel: pci 000d:00:01.0: supports D1 D2
May 27 18:07:19.191954 kernel: pci 000d:00:01.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.192025 kernel: pci 000d:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.192087 kernel: pci 000d:00:02.0: PCI bridge to [bus 02]
May 27 18:07:19.192148 kernel: pci 000d:00:02.0: supports D1 D2
May 27 18:07:19.192210 kernel: pci 000d:00:02.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.192279 kernel: pci 000d:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.192340 kernel: pci 000d:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.192401 kernel: pci 000d:00:03.0: supports D1 D2
May 27 18:07:19.192462 kernel: pci 000d:00:03.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.192531 kernel: pci 000d:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.192597 kernel: pci 000d:00:04.0: PCI bridge to [bus 04]
May 27 18:07:19.192661 kernel: pci 000d:00:04.0: supports D1 D2
May 27 18:07:19.192722 kernel: pci 000d:00:04.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.192731 kernel: acpiphp: Slot [1] registered
May 27 18:07:19.192739 kernel: acpiphp: Slot [2] registered
May 27 18:07:19.192746 kernel: acpiphp: Slot [3] registered
May 27 18:07:19.192753 kernel: acpiphp: Slot [4] registered
May 27 18:07:19.192807 kernel: pci_bus 000d:00: on NUMA node 0
May 27 18:07:19.192869 kernel: pci 000d:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 27 18:07:19.192932 kernel: pci 000d:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.192993 kernel: pci 000d:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.193055 kernel: pci 000d:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 27 18:07:19.193116 kernel: pci 000d:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.193178 kernel: pci 000d:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.193239 kernel: pci 000d:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 27 18:07:19.193301 kernel: pci 000d:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.193365 kernel: pci 000d:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.193426 kernel: pci 000d:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 27 18:07:19.193486 kernel: pci 000d:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.193547 kernel: pci 000d:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.193613 kernel: pci 000d:00:01.0: bridge window [mem 0x50000000-0x501fffff]: assigned
May 27 18:07:19.193675 kernel: pci 000d:00:01.0: bridge window [mem 0x340000000000-0x3400001fffff 64bit pref]: assigned
May 27 18:07:19.193735 kernel: pci 000d:00:02.0: bridge window [mem 0x50200000-0x503fffff]: assigned
May 27 18:07:19.193798 kernel: pci 000d:00:02.0: bridge window [mem 0x340000200000-0x3400003fffff 64bit pref]: assigned
May 27 18:07:19.193858 kernel: pci 000d:00:03.0: bridge window [mem 0x50400000-0x505fffff]: assigned
May 27 18:07:19.193919 kernel: pci 000d:00:03.0: bridge window [mem 0x340000400000-0x3400005fffff 64bit pref]: assigned
May 27 18:07:19.193979 kernel: pci 000d:00:04.0: bridge window [mem 0x50600000-0x507fffff]: assigned
May 27 18:07:19.194040 kernel: pci 000d:00:04.0: bridge window [mem 0x340000600000-0x3400007fffff 64bit pref]: assigned
May 27 18:07:19.194101 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194162 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194223 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194286 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194347 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194408 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194469 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194529 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194593 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194656 kernel: pci 000d:00:04.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194717 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194777 kernel: pci 000d:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194837 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.194898 kernel: pci 000d:00:02.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.194958 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.195019 kernel: pci 000d:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.195080 kernel: pci 000d:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.195141 kernel: pci 000d:00:01.0: bridge window [mem 0x50000000-0x501fffff]
May 27 18:07:19.195203 kernel: pci 000d:00:01.0: bridge window [mem 0x340000000000-0x3400001fffff 64bit pref]
May 27 18:07:19.195265 kernel: pci 000d:00:02.0: PCI bridge to [bus 02]
May 27 18:07:19.195325 kernel: pci 000d:00:02.0: bridge window [mem 0x50200000-0x503fffff]
May 27 18:07:19.195386 kernel: pci 000d:00:02.0: bridge window [mem 0x340000200000-0x3400003fffff 64bit pref]
May 27 18:07:19.195447 kernel: pci 000d:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.195509 kernel: pci 000d:00:03.0: bridge window [mem 0x50400000-0x505fffff]
May 27 18:07:19.195572 kernel: pci 000d:00:03.0: bridge window [mem 0x340000400000-0x3400005fffff 64bit pref]
May 27 18:07:19.195636 kernel: pci 000d:00:04.0: PCI bridge to [bus 04]
May 27 18:07:19.195698 kernel: pci 000d:00:04.0: bridge window [mem 0x50600000-0x507fffff]
May 27 18:07:19.195759 kernel: pci 000d:00:04.0: bridge window [mem 0x340000600000-0x3400007fffff 64bit pref]
May 27 18:07:19.195814 kernel: pci_bus 000d:00: resource 4 [mem 0x50000000-0x5fffffff window]
May 27 18:07:19.195869 kernel: pci_bus 000d:00: resource 5 [mem 0x340000000000-0x37ffdfffffff window]
May 27 18:07:19.195935 kernel: pci_bus 000d:01: resource 1 [mem 0x50000000-0x501fffff]
May 27 18:07:19.195994 kernel: pci_bus 000d:01: resource 2 [mem 0x340000000000-0x3400001fffff 64bit pref]
May 27 18:07:19.196057 kernel: pci_bus 000d:02: resource 1 [mem 0x50200000-0x503fffff]
May 27 18:07:19.196113 kernel: pci_bus 000d:02: resource 2 [mem 0x340000200000-0x3400003fffff 64bit pref]
May 27 18:07:19.196187 kernel: pci_bus 000d:03: resource 1 [mem 0x50400000-0x505fffff]
May 27 18:07:19.196244 kernel: pci_bus 000d:03: resource 2 [mem 0x340000400000-0x3400005fffff 64bit pref]
May 27 18:07:19.196306 kernel: pci_bus 000d:04: resource 1 [mem 0x50600000-0x507fffff]
May 27 18:07:19.196366 kernel: pci_bus 000d:04: resource 2 [mem 0x340000600000-0x3400007fffff 64bit pref]
May 27 18:07:19.196376 kernel: ACPI: PCI Root Bridge [PCI3] (domain 0000 [bus 00-ff])
May 27 18:07:19.196441 kernel: acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:07:19.196502 kernel: acpi PNP0A08:01: _OSC: platform does not support [PCIeHotplug PME LTR]
May 27 18:07:19.196565 kernel: acpi PNP0A08:01: _OSC: OS now controls [AER PCIeCapability]
May 27 18:07:19.196643 kernel: acpi PNP0A08:01: MCFG quirk: ECAM at [mem 0x3ffff0000000-0x3fffffffffff] for [bus 00-ff] with pci_32b_read_ops
May 27 18:07:19.196706 kernel: acpi PNP0A08:01: ECAM area [mem 0x3ffff0000000-0x3fffffffffff] reserved by PNP0C02:00
May 27 18:07:19.196765 kernel: acpi PNP0A08:01: ECAM at [mem 0x3ffff0000000-0x3fffffffffff] for [bus 00-ff]
May 27 18:07:19.196774 kernel: PCI host bridge to bus 0000:00
May 27 18:07:19.196839 kernel: pci_bus 0000:00: root bus resource [mem 0x70000000-0x7fffffff window]
May 27 18:07:19.196894 kernel: pci_bus 0000:00: root bus resource [mem 0x3c0000000000-0x3fffdfffffff window]
May 27 18:07:19.196948 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 27 18:07:19.197018 kernel: pci 0000:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint
May 27 18:07:19.197093 kernel: pci 0000:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.197162 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.197226 kernel: pci 0000:00:01.0: enabling Extended Tags
May 27 18:07:19.197288 kernel: pci 0000:00:01.0: supports D1 D2
May 27 18:07:19.197352 kernel: pci 0000:00:01.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.197422 kernel: pci 0000:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.197484 kernel: pci 0000:00:02.0: PCI bridge to [bus 02]
May 27 18:07:19.197546 kernel: pci 0000:00:02.0: supports D1 D2
May 27 18:07:19.197613 kernel: pci 0000:00:02.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.197685 kernel: pci 0000:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.197747 kernel: pci 0000:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.197807 kernel: pci 0000:00:03.0: supports D1 D2
May 27 18:07:19.197868 kernel: pci 0000:00:03.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.197936 kernel: pci 0000:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.198000 kernel: pci 0000:00:04.0: PCI bridge to [bus 04]
May 27 18:07:19.198063 kernel: pci 0000:00:04.0: supports D1 D2
May 27 18:07:19.198124 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.198133 kernel: acpiphp: Slot [1-1] registered
May 27 18:07:19.198141 kernel: acpiphp: Slot [2-1] registered
May 27 18:07:19.198148 kernel: acpiphp: Slot [3-1] registered
May 27 18:07:19.198155 kernel: acpiphp: Slot [4-1] registered
May 27 18:07:19.198209 kernel: pci_bus 0000:00: on NUMA node 0
May 27 18:07:19.198271 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 27 18:07:19.198336 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.198397 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.198459 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 27 18:07:19.198519 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.198584 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.198646 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 27 18:07:19.198710 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.198771 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.198833 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 27 18:07:19.198895 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.198957 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.199017 kernel: pci 0000:00:01.0: bridge window [mem 0x70000000-0x701fffff]: assigned
May 27 18:07:19.199079 kernel: pci 0000:00:01.0: bridge window [mem 0x3c0000000000-0x3c00001fffff 64bit pref]: assigned
May 27 18:07:19.199142 kernel: pci 0000:00:02.0: bridge window [mem 0x70200000-0x703fffff]: assigned
May 27 18:07:19.199203 kernel: pci 0000:00:02.0: bridge window [mem 0x3c0000200000-0x3c00003fffff 64bit pref]: assigned
May 27 18:07:19.199264 kernel: pci 0000:00:03.0: bridge window [mem 0x70400000-0x705fffff]: assigned
May 27 18:07:19.199324 kernel: pci 0000:00:03.0: bridge window [mem 0x3c0000400000-0x3c00005fffff 64bit pref]: assigned
May 27 18:07:19.199385 kernel: pci 0000:00:04.0: bridge window [mem 0x70600000-0x707fffff]: assigned
May 27 18:07:19.199446 kernel: pci 0000:00:04.0: bridge window [mem 0x3c0000600000-0x3c00007fffff 64bit pref]: assigned
May 27 18:07:19.199507 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.199567 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.199639 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.199700 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.199761 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.199823 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.199883 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.199943 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.200004 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.200065 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.200128 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.200189 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.200250 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.200310 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.200371 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.200433 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.200494 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.200555 kernel: pci 0000:00:01.0: bridge window [mem 0x70000000-0x701fffff]
May 27 18:07:19.200620 kernel: pci 0000:00:01.0: bridge window [mem 0x3c0000000000-0x3c00001fffff 64bit pref]
May 27 18:07:19.200682 kernel: pci 0000:00:02.0: PCI bridge to [bus 02]
May 27 18:07:19.200744 kernel: pci 0000:00:02.0: bridge window [mem 0x70200000-0x703fffff]
May 27 18:07:19.200805 kernel: pci 0000:00:02.0: bridge window [mem 0x3c0000200000-0x3c00003fffff 64bit pref]
May 27 18:07:19.200868 kernel: pci 0000:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.200929 kernel: pci 0000:00:03.0: bridge window [mem 0x70400000-0x705fffff]
May 27 18:07:19.200989 kernel: pci 0000:00:03.0: bridge window [mem 0x3c0000400000-0x3c00005fffff 64bit pref]
May 27 18:07:19.201050 kernel: pci 0000:00:04.0: PCI bridge to [bus 04]
May 27 18:07:19.201111 kernel: pci 0000:00:04.0: bridge window [mem 0x70600000-0x707fffff]
May 27 18:07:19.201172 kernel: pci 0000:00:04.0: bridge window [mem 0x3c0000600000-0x3c00007fffff 64bit pref]
May 27 18:07:19.201229 kernel: pci_bus 0000:00: resource 4 [mem 0x70000000-0x7fffffff window]
May 27 18:07:19.201283 kernel: pci_bus 0000:00: resource 5 [mem 0x3c0000000000-0x3fffdfffffff window]
May 27 18:07:19.201348 kernel: pci_bus 0000:01: resource 1 [mem 0x70000000-0x701fffff]
May 27 18:07:19.201405 kernel: pci_bus 0000:01: resource 2 [mem 0x3c0000000000-0x3c00001fffff 64bit pref]
May 27 18:07:19.201469 kernel: pci_bus 0000:02: resource 1 [mem 0x70200000-0x703fffff]
May 27 18:07:19.201526 kernel: pci_bus 0000:02: resource 2 [mem 0x3c0000200000-0x3c00003fffff 64bit pref]
May 27 18:07:19.201601 kernel: pci_bus 0000:03: resource 1 [mem 0x70400000-0x705fffff]
May 27 18:07:19.201661 kernel: pci_bus 0000:03: resource 2 [mem 0x3c0000400000-0x3c00005fffff 64bit pref]
May 27 18:07:19.201725 kernel: pci_bus 0000:04: resource 1 [mem 0x70600000-0x707fffff]
May 27 18:07:19.201782 kernel: pci_bus 0000:04: resource 2 [mem 0x3c0000600000-0x3c00007fffff 64bit pref]
May 27 18:07:19.201791 kernel: ACPI: PCI Root Bridge [PCI7] (domain 0005 [bus 00-ff])
May 27 18:07:19.201860 kernel: acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:07:19.201920 kernel: acpi PNP0A08:02: _OSC: platform does not support [PCIeHotplug PME LTR]
May 27 18:07:19.201981 kernel: acpi PNP0A08:02: _OSC: OS now controls [AER PCIeCapability]
May 27 18:07:19.202039 kernel: acpi PNP0A08:02: MCFG quirk: ECAM at [mem 0x2ffff0000000-0x2fffffffffff] for [bus 00-ff] with pci_32b_read_ops
May 27 18:07:19.202097 kernel: acpi PNP0A08:02: ECAM area [mem 0x2ffff0000000-0x2fffffffffff] reserved by PNP0C02:00
May 27 18:07:19.202154 kernel: acpi PNP0A08:02: ECAM at [mem 0x2ffff0000000-0x2fffffffffff] for [bus 00-ff]
May 27 18:07:19.202164 kernel: PCI host bridge to bus 0005:00
May 27 18:07:19.202225 kernel: pci_bus 0005:00: root bus resource [mem 0x30000000-0x3fffffff window]
May 27 18:07:19.202279 kernel: pci_bus 0005:00: root bus resource [mem 0x2c0000000000-0x2fffdfffffff window]
May 27 18:07:19.202334 kernel: pci_bus 0005:00: root bus resource [bus 00-ff]
May 27 18:07:19.202402 kernel: pci 0005:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint
May 27 18:07:19.202471 kernel: pci 0005:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.202534 kernel: pci 0005:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.202600 kernel: pci 0005:00:01.0: supports D1 D2
May 27 18:07:19.202662 kernel: pci 0005:00:01.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.202730 kernel: pci 0005:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.202794 kernel: pci 0005:00:03.0: PCI bridge to [bus 02]
May 27 18:07:19.202855 kernel: pci 0005:00:03.0: supports D1 D2
May 27 18:07:19.202916 kernel: pci 0005:00:03.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.202983 kernel: pci 0005:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.203046 kernel: pci 0005:00:05.0: PCI bridge to [bus 03]
May 27 18:07:19.203107 kernel: pci 0005:00:05.0: bridge window [mem 0x30100000-0x301fffff]
May 27 18:07:19.203167 kernel: pci 0005:00:05.0: supports D1 D2
May 27 18:07:19.203229 kernel: pci 0005:00:05.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.203297 kernel: pci 0005:00:07.0: [1def:e117] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.203358 kernel: pci 0005:00:07.0: PCI bridge to [bus 04]
May 27 18:07:19.203419 kernel: pci 0005:00:07.0: bridge window [mem 0x30000000-0x300fffff]
May 27 18:07:19.203479 kernel: pci 0005:00:07.0: supports D1 D2
May 27 18:07:19.203540 kernel: pci 0005:00:07.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.203550 kernel: acpiphp: Slot [1-2] registered
May 27 18:07:19.203559 kernel: acpiphp: Slot [2-2] registered
May 27 18:07:19.203636 kernel: pci 0005:03:00.0: [144d:a808] type 00 class 0x010802 PCIe Endpoint
May 27 18:07:19.203704 kernel: pci 0005:03:00.0: BAR 0 [mem 0x30110000-0x30113fff 64bit]
May 27 18:07:19.203768 kernel: pci 0005:03:00.0: ROM [mem 0x30100000-0x3010ffff pref]
May 27 18:07:19.203838 kernel: pci 0005:04:00.0: [144d:a808] type 00 class 0x010802 PCIe Endpoint
May 27 18:07:19.203901 kernel: pci 0005:04:00.0: BAR 0 [mem 0x30010000-0x30013fff 64bit]
May 27 18:07:19.203963 kernel: pci 0005:04:00.0: ROM [mem 0x30000000-0x3000ffff pref]
May 27 18:07:19.204020 kernel: pci_bus 0005:00: on NUMA node 0
May 27 18:07:19.204081 kernel: pci 0005:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 27 18:07:19.204143 kernel: pci 0005:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.204205 kernel: pci 0005:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.204266 kernel: pci 0005:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 27 18:07:19.204329 kernel: pci 0005:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.204390 kernel: pci 0005:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.204453 kernel: pci 0005:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 27 18:07:19.204515 kernel: pci 0005:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.204575 kernel: pci 0005:00:05.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
May 27 18:07:19.204641 kernel: pci 0005:00:07.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 27 18:07:19.204702 kernel: pci 0005:00:07.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.204764 kernel: pci 0005:00:07.0: bridge window [mem 0x00100000-0x001fffff] to [bus 04] add_size 100000 add_align 100000
May 27 18:07:19.204825 kernel: pci 0005:00:01.0: bridge window [mem 0x30000000-0x301fffff]: assigned
May 27 18:07:19.204888 kernel: pci 0005:00:01.0: bridge window [mem 0x2c0000000000-0x2c00001fffff 64bit pref]: assigned
May 27 18:07:19.204950 kernel: pci 0005:00:03.0: bridge window [mem 0x30200000-0x303fffff]: assigned
May 27 18:07:19.205011 kernel: pci 0005:00:03.0: bridge window [mem 0x2c0000200000-0x2c00003fffff 64bit pref]: assigned
May 27 18:07:19.205072 kernel: pci 0005:00:05.0: bridge window [mem 0x30400000-0x305fffff]: assigned
May 27 18:07:19.205134 kernel: pci 0005:00:05.0: bridge window [mem 0x2c0000400000-0x2c00005fffff 64bit pref]: assigned
May 27 18:07:19.205194 kernel: pci 0005:00:07.0: bridge window [mem 0x30600000-0x307fffff]: assigned
May 27 18:07:19.205255 kernel: pci 0005:00:07.0: bridge window [mem 0x2c0000600000-0x2c00007fffff 64bit pref]: assigned
May 27 18:07:19.205316 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205378 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.205439 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205500 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.205561 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205625 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.205685 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205746 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.205807 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205869 kernel: pci 0005:00:07.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.205930 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.205991 kernel: pci 0005:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.206053 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.206114 kernel: pci 0005:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.206175 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.206236 kernel: pci 0005:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.206298 kernel: pci 0005:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.206359 kernel: pci 0005:00:01.0: bridge window [mem 0x30000000-0x301fffff]
May 27 18:07:19.206419 kernel: pci 0005:00:01.0: bridge window [mem 0x2c0000000000-0x2c00001fffff 64bit pref]
May 27 18:07:19.206480 kernel: pci 0005:00:03.0: PCI bridge to [bus 02]
May 27 18:07:19.206540 kernel: pci 0005:00:03.0: bridge window [mem 0x30200000-0x303fffff]
May 27 18:07:19.206605 kernel: pci 0005:00:03.0: bridge window [mem 0x2c0000200000-0x2c00003fffff 64bit pref]
May 27 18:07:19.206669 kernel: pci 0005:03:00.0: ROM [mem 0x30400000-0x3040ffff pref]: assigned
May 27 18:07:19.206735 kernel: pci 0005:03:00.0: BAR 0 [mem 0x30410000-0x30413fff 64bit]: assigned
May 27 18:07:19.206796 kernel: pci 0005:00:05.0: PCI bridge to [bus 03]
May 27 18:07:19.206857 kernel: pci 0005:00:05.0: bridge window [mem 0x30400000-0x305fffff]
May 27 18:07:19.206918 kernel: pci 0005:00:05.0: bridge window [mem 0x2c0000400000-0x2c00005fffff 64bit pref]
May 27 18:07:19.206981 kernel: pci 0005:04:00.0: ROM [mem 0x30600000-0x3060ffff pref]: assigned
May 27 18:07:19.207043 kernel: pci 0005:04:00.0: BAR 0 [mem 0x30610000-0x30613fff 64bit]: assigned
May 27 18:07:19.207104 kernel: pci 0005:00:07.0: PCI bridge to [bus 04]
May 27 18:07:19.207165 kernel: pci 0005:00:07.0: bridge window [mem 0x30600000-0x307fffff]
May 27 18:07:19.207227 kernel: pci 0005:00:07.0: bridge window [mem 0x2c0000600000-0x2c00007fffff 64bit pref]
May 27 18:07:19.207282 kernel: pci_bus 0005:00: resource 4 [mem 0x30000000-0x3fffffff window]
May 27 18:07:19.207336 kernel: pci_bus 0005:00: resource 5 [mem 0x2c0000000000-0x2fffdfffffff window]
May 27 18:07:19.207402 kernel: pci_bus 0005:01: resource 1 [mem 0x30000000-0x301fffff]
May 27 18:07:19.207459 kernel: pci_bus 0005:01: resource 2 [mem 0x2c0000000000-0x2c00001fffff 64bit pref]
May 27 18:07:19.207529 kernel: pci_bus 0005:02: resource 1 [mem 0x30200000-0x303fffff]
May 27 18:07:19.207593 kernel: pci_bus 0005:02: resource 2 [mem 0x2c0000200000-0x2c00003fffff 64bit pref]
May 27 18:07:19.207659 kernel: pci_bus 0005:03: resource 1 [mem 0x30400000-0x305fffff]
May 27 18:07:19.207716 kernel: pci_bus 0005:03: resource 2 [mem 0x2c0000400000-0x2c00005fffff 64bit pref]
May 27 18:07:19.207781 kernel: pci_bus 0005:04: resource 1 [mem 0x30600000-0x307fffff]
May 27 18:07:19.207837 kernel: pci_bus 0005:04: resource 2 [mem 0x2c0000600000-0x2c00007fffff 64bit pref]
May 27 18:07:19.207847 kernel: ACPI: PCI Root Bridge [PCI5] (domain 0003 [bus 00-ff])
May 27 18:07:19.207914 kernel: acpi PNP0A08:03: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:07:19.207973 kernel: acpi PNP0A08:03: _OSC: platform does not support [PCIeHotplug PME LTR]
May 27 18:07:19.208032 kernel: acpi PNP0A08:03: _OSC: OS now controls [AER PCIeCapability]
May 27 18:07:19.208090 kernel: acpi PNP0A08:03: MCFG quirk: ECAM at [mem 0x27fff0000000-0x27ffffffffff] for [bus 00-ff] with pci_32b_read_ops
May 27 18:07:19.208147 kernel: acpi PNP0A08:03: ECAM area [mem 0x27fff0000000-0x27ffffffffff] reserved by PNP0C02:00
May 27 18:07:19.208205 kernel: acpi PNP0A08:03: ECAM at [mem 0x27fff0000000-0x27ffffffffff] for [bus 00-ff]
May 27 18:07:19.208215 kernel: PCI host bridge to bus 0003:00
May 27 18:07:19.208277 kernel: pci_bus 0003:00: root bus resource [mem 0x10000000-0x1fffffff window]
May 27 18:07:19.208332 kernel: pci_bus 0003:00: root bus resource [mem 0x240000000000-0x27ffdfffffff window]
May 27 18:07:19.208385 kernel: pci_bus 0003:00: root bus resource [bus 00-ff]
May 27 18:07:19.208453 kernel: pci 0003:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint
May 27 18:07:19.208522 kernel: pci 0003:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.208589 kernel: pci 0003:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.208654 kernel: pci 0003:00:01.0: supports D1 D2
May 27 18:07:19.208716 kernel: pci 0003:00:01.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.208785 kernel: pci 0003:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.208847 kernel: pci 0003:00:03.0: PCI bridge to [bus 02]
May 27 18:07:19.208908 kernel: pci 0003:00:03.0: supports D1 D2
May 27 18:07:19.208969 kernel: pci 0003:00:03.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.209037 kernel: pci 0003:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.209098 kernel: pci 0003:00:05.0: PCI bridge to [bus 03-04]
May 27 18:07:19.209161 kernel: pci 0003:00:05.0: bridge window [io 0x0000-0x0fff]
May 27 18:07:19.209222 kernel: pci 0003:00:05.0: bridge window [mem 0x10000000-0x100fffff]
May 27 18:07:19.209283 kernel: pci 0003:00:05.0: bridge window [mem 0x240000000000-0x2400000fffff 64bit pref]
May 27 18:07:19.209346 kernel: pci 0003:00:05.0: supports D1 D2
May 27 18:07:19.209408 kernel: pci 0003:00:05.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.209418 kernel: acpiphp: Slot [1-3] registered
May 27 18:07:19.209425 kernel: acpiphp: Slot [2-3] registered
May 27 18:07:19.209495 kernel: pci 0003:03:00.0: [8086:1521] type 00 class 0x020000 PCIe Endpoint
May 27 18:07:19.209559 kernel: pci 0003:03:00.0: BAR 0 [mem 0x10020000-0x1003ffff]
May 27 18:07:19.209641 kernel: pci 0003:03:00.0: BAR 2 [io 0x0020-0x003f]
May 27 18:07:19.209705 kernel: pci 0003:03:00.0: BAR 3 [mem 0x10044000-0x10047fff]
May 27 18:07:19.209768 kernel: pci 0003:03:00.0: PME# supported from D0 D3hot D3cold
May 27 18:07:19.209830 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000060000-0x240000063fff 64bit pref]
May 27 18:07:19.209895 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000060000-0x24000007ffff 64bit pref]: contains BAR 0 for 8 VFs
May 27 18:07:19.209959 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000040000-0x240000043fff 64bit pref]
May 27 18:07:19.210022 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000040000-0x24000005ffff 64bit pref]: contains BAR 3 for 8 VFs
May 27 18:07:19.210085 kernel: pci 0003:03:00.0: 8.000 Gb/s available PCIe bandwidth, limited by 5.0 GT/s PCIe x2 link at 0003:00:05.0 (capable of 16.000 Gb/s with 5.0 GT/s PCIe x4 link)
May 27 18:07:19.210155 kernel: pci 0003:03:00.1: [8086:1521] type 00 class 0x020000 PCIe Endpoint
May 27 18:07:19.210219 kernel: pci 0003:03:00.1: BAR 0 [mem 0x10000000-0x1001ffff]
May 27 18:07:19.210281 kernel: pci 0003:03:00.1: BAR 2 [io 0x0000-0x001f]
May 27 18:07:19.210344 kernel: pci 0003:03:00.1: BAR 3 [mem 0x10040000-0x10043fff]
May 27 18:07:19.210406 kernel: pci 0003:03:00.1: PME# supported from D0 D3hot D3cold
May 27 18:07:19.210471 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000020000-0x240000023fff 64bit pref]
May 27 18:07:19.210534 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000020000-0x24000003ffff 64bit pref]: contains BAR 0 for 8 VFs
May 27 18:07:19.210630 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000000000-0x240000003fff 64bit pref]
May 27 18:07:19.210707 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000000000-0x24000001ffff 64bit pref]: contains BAR 3 for 8 VFs
May 27 18:07:19.210764 kernel: pci_bus 0003:00: on NUMA node 0
May 27 18:07:19.210827 kernel: pci 0003:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 27 18:07:19.210889 kernel: pci 0003:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.210953 kernel: pci 0003:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000
May 27 18:07:19.211015 kernel: pci 0003:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 27 18:07:19.211077 kernel: pci 0003:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.211140 kernel: pci 0003:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000
May 27 18:07:19.211202 kernel: pci 0003:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03-04] add_size 300000 add_align 100000
May 27 18:07:19.211263 kernel: pci 0003:00:05.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03-04] add_size 100000 add_align 100000
May 27 18:07:19.211324 kernel: pci 0003:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned
May 27 18:07:19.211387 kernel: pci 0003:00:01.0: bridge window [mem 0x240000000000-0x2400001fffff 64bit pref]: assigned
May 27 18:07:19.211449 kernel: pci 0003:00:03.0: bridge window [mem 0x10200000-0x103fffff]: assigned
May 27 18:07:19.211510 kernel: pci 0003:00:03.0: bridge window [mem 0x240000200000-0x2400003fffff 64bit pref]: assigned
May 27 18:07:19.211571 kernel: pci 0003:00:05.0: bridge window [mem 0x10400000-0x105fffff]: assigned
May 27 18:07:19.211638 kernel: pci 0003:00:05.0: bridge window [mem 0x240000400000-0x2400006fffff 64bit pref]: assigned
May 27 18:07:19.211700 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.211761 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.211822 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.211885 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.211946 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.212008 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.212073 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.212134 kernel: pci 0003:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.212195 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.212258 kernel: pci 0003:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.212319 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.212379 kernel: pci 0003:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.212440 kernel: pci 0003:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.212501 kernel: pci 0003:00:01.0: bridge window [mem 0x10000000-0x101fffff]
May 27 18:07:19.212562 kernel: pci 0003:00:01.0: bridge window [mem 0x240000000000-0x2400001fffff 64bit pref]
May 27 18:07:19.212630 kernel: pci 0003:00:03.0: PCI bridge to [bus 02]
May 27 18:07:19.212694 kernel: pci 0003:00:03.0: bridge window [mem 0x10200000-0x103fffff]
May 27 18:07:19.212755 kernel: pci 0003:00:03.0: bridge window [mem 0x240000200000-0x2400003fffff 64bit pref]
May 27 18:07:19.212819 kernel: pci 0003:03:00.0: BAR 0 [mem 0x10400000-0x1041ffff]: assigned
May 27 18:07:19.212881 kernel: pci 0003:03:00.1: BAR 0 [mem 0x10420000-0x1043ffff]: assigned
May 27 18:07:19.212944 kernel: pci 0003:03:00.0: BAR 3 [mem 0x10440000-0x10443fff]: assigned
May 27 18:07:19.213006 kernel: pci 0003:03:00.0: VF BAR 0 [mem 0x240000400000-0x24000041ffff 64bit pref]: assigned
May 27 18:07:19.213069 kernel: pci 0003:03:00.0: VF BAR 3 [mem 0x240000420000-0x24000043ffff 64bit pref]: assigned
May 27 18:07:19.213131 kernel: pci 0003:03:00.1: BAR 3 [mem 0x10444000-0x10447fff]: assigned
May 27 18:07:19.213196 kernel: pci 0003:03:00.1: VF BAR 0 [mem 0x240000440000-0x24000045ffff 64bit pref]: assigned
May 27 18:07:19.213258 kernel: pci 0003:03:00.1: VF BAR 3 [mem 0x240000460000-0x24000047ffff 64bit pref]: assigned
May 27 18:07:19.213321 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: can't assign; no space
May 27 18:07:19.213384 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: failed to assign
May 27 18:07:19.213448 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: can't assign; no space
May 27 18:07:19.213510 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: failed to assign
May 27 18:07:19.213574 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: can't assign; no space
May 27 18:07:19.213641 kernel: pci 0003:03:00.0: BAR 2 [io size 0x0020]: failed to assign
May 27 18:07:19.213705 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: can't assign; no space
May 27 18:07:19.213768 kernel: pci 0003:03:00.1: BAR 2 [io size 0x0020]: failed to assign
May 27 18:07:19.213829 kernel: pci 0003:00:05.0: PCI bridge to [bus 03-04]
May 27 18:07:19.213891 kernel: pci 0003:00:05.0: bridge window [mem 0x10400000-0x105fffff]
May 27 18:07:19.213952 kernel: pci 0003:00:05.0: bridge window [mem 0x240000400000-0x2400006fffff 64bit pref]
May 27 18:07:19.214007 kernel: pci_bus 0003:00: Some PCI device resources are unassigned, try booting with pci=realloc
May 27 18:07:19.214061 kernel: pci_bus 0003:00: resource 4 [mem 0x10000000-0x1fffffff window]
May 27 18:07:19.214117 kernel: pci_bus 0003:00: resource 5 [mem 0x240000000000-0x27ffdfffffff window]
May 27 18:07:19.214182 kernel: pci_bus 0003:01: resource 1 [mem 0x10000000-0x101fffff]
May 27 18:07:19.214239 kernel: pci_bus 0003:01: resource 2 [mem 0x240000000000-0x2400001fffff 64bit pref]
May 27 18:07:19.214312 kernel: pci_bus 0003:02: resource 1 [mem 0x10200000-0x103fffff]
May 27 18:07:19.214368 kernel: pci_bus 0003:02: resource 2 [mem 0x240000200000-0x2400003fffff 64bit pref]
May 27 18:07:19.214431 kernel: pci_bus 0003:03: resource 1 [mem 0x10400000-0x105fffff]
May 27 18:07:19.214489 kernel: pci_bus 0003:03: resource 2 [mem 0x240000400000-0x2400006fffff 64bit pref]
May 27 18:07:19.214500 kernel: ACPI: PCI Root Bridge [PCI0] (domain 000c [bus 00-ff])
May 27 18:07:19.214565 kernel: acpi PNP0A08:04: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:07:19.214629 kernel: acpi PNP0A08:04: _OSC: platform does not support [PCIeHotplug PME LTR]
May 27 18:07:19.214690 kernel: acpi PNP0A08:04: _OSC: OS now controls [AER PCIeCapability]
May 27 18:07:19.214749 kernel: acpi PNP0A08:04: MCFG quirk: ECAM at [mem 0x33fff0000000-0x33ffffffffff] for [bus 00-ff] with pci_32b_read_ops
May 27 18:07:19.214807 kernel: acpi PNP0A08:04: ECAM area [mem 0x33fff0000000-0x33ffffffffff] reserved by PNP0C02:00
May 27 18:07:19.214867 kernel: acpi PNP0A08:04: ECAM at [mem 0x33fff0000000-0x33ffffffffff] for [bus 00-ff]
May 27 18:07:19.214877 kernel: PCI host bridge to bus 000c:00
May 27 18:07:19.214947 kernel: pci_bus 000c:00: root bus resource [mem 0x40000000-0x4fffffff window]
May 27 18:07:19.215003 kernel: pci_bus 000c:00: root bus resource [mem 0x300000000000-0x33ffdfffffff window]
May 27 18:07:19.215057 kernel: pci_bus 000c:00: root bus resource [bus 00-ff]
May 27 18:07:19.215126 kernel: pci 000c:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint
May 27 18:07:19.215194 kernel: pci 000c:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.215259 kernel: pci 000c:00:01.0: PCI bridge to [bus 01]
May 27 18:07:19.215321 kernel: pci 000c:00:01.0: enabling Extended Tags
May 27 18:07:19.215381 kernel: pci 000c:00:01.0: supports D1 D2
May 27 18:07:19.215443 kernel: pci 000c:00:01.0: PME# supported from D0 D1 D3hot
May 27
18:07:19.215511 kernel: pci 000c:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.215573 kernel: pci 000c:00:02.0: PCI bridge to [bus 02] May 27 18:07:19.215639 kernel: pci 000c:00:02.0: supports D1 D2 May 27 18:07:19.215703 kernel: pci 000c:00:02.0: PME# supported from D0 D1 D3hot May 27 18:07:19.215771 kernel: pci 000c:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.215832 kernel: pci 000c:00:03.0: PCI bridge to [bus 03] May 27 18:07:19.215894 kernel: pci 000c:00:03.0: supports D1 D2 May 27 18:07:19.215955 kernel: pci 000c:00:03.0: PME# supported from D0 D1 D3hot May 27 18:07:19.216022 kernel: pci 000c:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.216084 kernel: pci 000c:00:04.0: PCI bridge to [bus 04] May 27 18:07:19.216148 kernel: pci 000c:00:04.0: supports D1 D2 May 27 18:07:19.216209 kernel: pci 000c:00:04.0: PME# supported from D0 D1 D3hot May 27 18:07:19.216219 kernel: acpiphp: Slot [1-4] registered May 27 18:07:19.216227 kernel: acpiphp: Slot [2-4] registered May 27 18:07:19.216235 kernel: acpiphp: Slot [3-2] registered May 27 18:07:19.216242 kernel: acpiphp: Slot [4-2] registered May 27 18:07:19.216296 kernel: pci_bus 000c:00: on NUMA node 0 May 27 18:07:19.216358 kernel: pci 000c:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 18:07:19.216423 kernel: pci 000c:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 18:07:19.216484 kernel: pci 000c:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 18:07:19.216545 kernel: pci 000c:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 18:07:19.216613 kernel: pci 000c:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 18:07:19.216675 kernel: pci 000c:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 
02] add_size 200000 add_align 100000 May 27 18:07:19.216737 kernel: pci 000c:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 18:07:19.216799 kernel: pci 000c:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.216862 kernel: pci 000c:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.216926 kernel: pci 000c:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 18:07:19.216988 kernel: pci 000c:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.217050 kernel: pci 000c:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.217111 kernel: pci 000c:00:01.0: bridge window [mem 0x40000000-0x401fffff]: assigned May 27 18:07:19.217173 kernel: pci 000c:00:01.0: bridge window [mem 0x300000000000-0x3000001fffff 64bit pref]: assigned May 27 18:07:19.217234 kernel: pci 000c:00:02.0: bridge window [mem 0x40200000-0x403fffff]: assigned May 27 18:07:19.217297 kernel: pci 000c:00:02.0: bridge window [mem 0x300000200000-0x3000003fffff 64bit pref]: assigned May 27 18:07:19.217359 kernel: pci 000c:00:03.0: bridge window [mem 0x40400000-0x405fffff]: assigned May 27 18:07:19.217421 kernel: pci 000c:00:03.0: bridge window [mem 0x300000400000-0x3000005fffff 64bit pref]: assigned May 27 18:07:19.217482 kernel: pci 000c:00:04.0: bridge window [mem 0x40600000-0x407fffff]: assigned May 27 18:07:19.217543 kernel: pci 000c:00:04.0: bridge window [mem 0x300000600000-0x3000007fffff 64bit pref]: assigned May 27 18:07:19.217609 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.217671 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.217732 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: can't assign; 
no space May 27 18:07:19.217795 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.217856 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.217917 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.217978 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.218039 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.218099 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.218160 kernel: pci 000c:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.218221 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.218281 kernel: pci 000c:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.218344 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.218405 kernel: pci 000c:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.218466 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.218527 kernel: pci 000c:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.218592 kernel: pci 000c:00:01.0: PCI bridge to [bus 01] May 27 18:07:19.218654 kernel: pci 000c:00:01.0: bridge window [mem 0x40000000-0x401fffff] May 27 18:07:19.218716 kernel: pci 000c:00:01.0: bridge window [mem 0x300000000000-0x3000001fffff 64bit pref] May 27 18:07:19.218780 kernel: pci 000c:00:02.0: PCI bridge to [bus 02] May 27 18:07:19.218844 kernel: pci 000c:00:02.0: bridge window [mem 0x40200000-0x403fffff] May 27 18:07:19.218905 kernel: pci 000c:00:02.0: bridge window [mem 0x300000200000-0x3000003fffff 64bit pref] May 27 18:07:19.218966 kernel: pci 000c:00:03.0: PCI bridge to [bus 03] May 27 18:07:19.219028 kernel: pci 000c:00:03.0: bridge window 
[mem 0x40400000-0x405fffff] May 27 18:07:19.219089 kernel: pci 000c:00:03.0: bridge window [mem 0x300000400000-0x3000005fffff 64bit pref] May 27 18:07:19.219151 kernel: pci 000c:00:04.0: PCI bridge to [bus 04] May 27 18:07:19.219212 kernel: pci 000c:00:04.0: bridge window [mem 0x40600000-0x407fffff] May 27 18:07:19.219273 kernel: pci 000c:00:04.0: bridge window [mem 0x300000600000-0x3000007fffff 64bit pref] May 27 18:07:19.219328 kernel: pci_bus 000c:00: resource 4 [mem 0x40000000-0x4fffffff window] May 27 18:07:19.219382 kernel: pci_bus 000c:00: resource 5 [mem 0x300000000000-0x33ffdfffffff window] May 27 18:07:19.219448 kernel: pci_bus 000c:01: resource 1 [mem 0x40000000-0x401fffff] May 27 18:07:19.219506 kernel: pci_bus 000c:01: resource 2 [mem 0x300000000000-0x3000001fffff 64bit pref] May 27 18:07:19.219569 kernel: pci_bus 000c:02: resource 1 [mem 0x40200000-0x403fffff] May 27 18:07:19.219631 kernel: pci_bus 000c:02: resource 2 [mem 0x300000200000-0x3000003fffff 64bit pref] May 27 18:07:19.219703 kernel: pci_bus 000c:03: resource 1 [mem 0x40400000-0x405fffff] May 27 18:07:19.219760 kernel: pci_bus 000c:03: resource 2 [mem 0x300000400000-0x3000005fffff 64bit pref] May 27 18:07:19.219827 kernel: pci_bus 000c:04: resource 1 [mem 0x40600000-0x407fffff] May 27 18:07:19.219884 kernel: pci_bus 000c:04: resource 2 [mem 0x300000600000-0x3000007fffff 64bit pref] May 27 18:07:19.219894 kernel: ACPI: PCI Root Bridge [PCI4] (domain 0002 [bus 00-ff]) May 27 18:07:19.219962 kernel: acpi PNP0A08:05: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 18:07:19.220022 kernel: acpi PNP0A08:05: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 18:07:19.220081 kernel: acpi PNP0A08:05: _OSC: OS now controls [AER PCIeCapability] May 27 18:07:19.220139 kernel: acpi PNP0A08:05: MCFG quirk: ECAM at [mem 0x23fff0000000-0x23ffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 18:07:19.220199 kernel: acpi PNP0A08:05: ECAM area [mem 
0x23fff0000000-0x23ffffffffff] reserved by PNP0C02:00 May 27 18:07:19.220258 kernel: acpi PNP0A08:05: ECAM at [mem 0x23fff0000000-0x23ffffffffff] for [bus 00-ff] May 27 18:07:19.220268 kernel: PCI host bridge to bus 0002:00 May 27 18:07:19.220329 kernel: pci_bus 0002:00: root bus resource [mem 0x00800000-0x0fffffff window] May 27 18:07:19.220384 kernel: pci_bus 0002:00: root bus resource [mem 0x200000000000-0x23ffdfffffff window] May 27 18:07:19.220438 kernel: pci_bus 0002:00: root bus resource [bus 00-ff] May 27 18:07:19.220507 kernel: pci 0002:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 18:07:19.220586 kernel: pci 0002:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.220652 kernel: pci 0002:00:01.0: PCI bridge to [bus 01] May 27 18:07:19.220714 kernel: pci 0002:00:01.0: supports D1 D2 May 27 18:07:19.220775 kernel: pci 0002:00:01.0: PME# supported from D0 D1 D3hot May 27 18:07:19.220843 kernel: pci 0002:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.220905 kernel: pci 0002:00:03.0: PCI bridge to [bus 02] May 27 18:07:19.220967 kernel: pci 0002:00:03.0: supports D1 D2 May 27 18:07:19.221031 kernel: pci 0002:00:03.0: PME# supported from D0 D1 D3hot May 27 18:07:19.221098 kernel: pci 0002:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.221162 kernel: pci 0002:00:05.0: PCI bridge to [bus 03] May 27 18:07:19.221223 kernel: pci 0002:00:05.0: supports D1 D2 May 27 18:07:19.221285 kernel: pci 0002:00:05.0: PME# supported from D0 D1 D3hot May 27 18:07:19.221353 kernel: pci 0002:00:07.0: [1def:e117] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.221415 kernel: pci 0002:00:07.0: PCI bridge to [bus 04] May 27 18:07:19.221478 kernel: pci 0002:00:07.0: supports D1 D2 May 27 18:07:19.221539 kernel: pci 0002:00:07.0: PME# supported from D0 D1 D3hot May 27 18:07:19.221549 kernel: acpiphp: Slot [1-5] registered May 27 18:07:19.221557 kernel: acpiphp: Slot [2-5] 
registered May 27 18:07:19.221565 kernel: acpiphp: Slot [3-3] registered May 27 18:07:19.221573 kernel: acpiphp: Slot [4-3] registered May 27 18:07:19.221631 kernel: pci_bus 0002:00: on NUMA node 0 May 27 18:07:19.221693 kernel: pci 0002:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 18:07:19.221757 kernel: pci 0002:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01] add_size 200000 add_align 100000 May 27 18:07:19.221818 kernel: pci 0002:00:01.0: bridge window [mem 0x00100000-0x000fffff] to [bus 01] add_size 200000 add_align 100000 May 27 18:07:19.221880 kernel: pci 0002:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 18:07:19.221942 kernel: pci 0002:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 18:07:19.222002 kernel: pci 0002:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 18:07:19.222064 kernel: pci 0002:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 18:07:19.222126 kernel: pci 0002:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.222189 kernel: pci 0002:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.222252 kernel: pci 0002:00:07.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 18:07:19.222313 kernel: pci 0002:00:07.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.222374 kernel: pci 0002:00:07.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.222435 kernel: pci 0002:00:01.0: bridge window [mem 0x00800000-0x009fffff]: assigned May 27 18:07:19.222497 kernel: pci 0002:00:01.0: bridge window [mem 0x200000000000-0x2000001fffff 64bit pref]: assigned May 27 
18:07:19.222558 kernel: pci 0002:00:03.0: bridge window [mem 0x00a00000-0x00bfffff]: assigned May 27 18:07:19.222627 kernel: pci 0002:00:03.0: bridge window [mem 0x200000200000-0x2000003fffff 64bit pref]: assigned May 27 18:07:19.222689 kernel: pci 0002:00:05.0: bridge window [mem 0x00c00000-0x00dfffff]: assigned May 27 18:07:19.222750 kernel: pci 0002:00:05.0: bridge window [mem 0x200000400000-0x2000005fffff 64bit pref]: assigned May 27 18:07:19.222812 kernel: pci 0002:00:07.0: bridge window [mem 0x00e00000-0x00ffffff]: assigned May 27 18:07:19.222874 kernel: pci 0002:00:07.0: bridge window [mem 0x200000600000-0x2000007fffff 64bit pref]: assigned May 27 18:07:19.222936 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.222997 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223058 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223121 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223184 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223245 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223306 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223367 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223428 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223489 kernel: pci 0002:00:07.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223552 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223618 kernel: pci 0002:00:05.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223680 kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223740 
kernel: pci 0002:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223802 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.223863 kernel: pci 0002:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.223924 kernel: pci 0002:00:01.0: PCI bridge to [bus 01] May 27 18:07:19.223985 kernel: pci 0002:00:01.0: bridge window [mem 0x00800000-0x009fffff] May 27 18:07:19.224047 kernel: pci 0002:00:01.0: bridge window [mem 0x200000000000-0x2000001fffff 64bit pref] May 27 18:07:19.224110 kernel: pci 0002:00:03.0: PCI bridge to [bus 02] May 27 18:07:19.224171 kernel: pci 0002:00:03.0: bridge window [mem 0x00a00000-0x00bfffff] May 27 18:07:19.224233 kernel: pci 0002:00:03.0: bridge window [mem 0x200000200000-0x2000003fffff 64bit pref] May 27 18:07:19.224294 kernel: pci 0002:00:05.0: PCI bridge to [bus 03] May 27 18:07:19.224356 kernel: pci 0002:00:05.0: bridge window [mem 0x00c00000-0x00dfffff] May 27 18:07:19.224417 kernel: pci 0002:00:05.0: bridge window [mem 0x200000400000-0x2000005fffff 64bit pref] May 27 18:07:19.224478 kernel: pci 0002:00:07.0: PCI bridge to [bus 04] May 27 18:07:19.224541 kernel: pci 0002:00:07.0: bridge window [mem 0x00e00000-0x00ffffff] May 27 18:07:19.224617 kernel: pci 0002:00:07.0: bridge window [mem 0x200000600000-0x2000007fffff 64bit pref] May 27 18:07:19.224674 kernel: pci_bus 0002:00: resource 4 [mem 0x00800000-0x0fffffff window] May 27 18:07:19.224729 kernel: pci_bus 0002:00: resource 5 [mem 0x200000000000-0x23ffdfffffff window] May 27 18:07:19.224793 kernel: pci_bus 0002:01: resource 1 [mem 0x00800000-0x009fffff] May 27 18:07:19.224850 kernel: pci_bus 0002:01: resource 2 [mem 0x200000000000-0x2000001fffff 64bit pref] May 27 18:07:19.224917 kernel: pci_bus 0002:02: resource 1 [mem 0x00a00000-0x00bfffff] May 27 18:07:19.224974 kernel: pci_bus 0002:02: resource 2 [mem 0x200000200000-0x2000003fffff 64bit pref] May 27 18:07:19.225037 kernel: pci_bus 0002:03: 
resource 1 [mem 0x00c00000-0x00dfffff] May 27 18:07:19.225094 kernel: pci_bus 0002:03: resource 2 [mem 0x200000400000-0x2000005fffff 64bit pref] May 27 18:07:19.225165 kernel: pci_bus 0002:04: resource 1 [mem 0x00e00000-0x00ffffff] May 27 18:07:19.225223 kernel: pci_bus 0002:04: resource 2 [mem 0x200000600000-0x2000007fffff 64bit pref] May 27 18:07:19.225234 kernel: ACPI: PCI Root Bridge [PCI2] (domain 0001 [bus 00-ff]) May 27 18:07:19.225301 kernel: acpi PNP0A08:06: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 18:07:19.225361 kernel: acpi PNP0A08:06: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 18:07:19.225420 kernel: acpi PNP0A08:06: _OSC: OS now controls [AER PCIeCapability] May 27 18:07:19.225478 kernel: acpi PNP0A08:06: MCFG quirk: ECAM at [mem 0x3bfff0000000-0x3bffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 18:07:19.225537 kernel: acpi PNP0A08:06: ECAM area [mem 0x3bfff0000000-0x3bffffffffff] reserved by PNP0C02:00 May 27 18:07:19.225603 kernel: acpi PNP0A08:06: ECAM at [mem 0x3bfff0000000-0x3bffffffffff] for [bus 00-ff] May 27 18:07:19.225614 kernel: PCI host bridge to bus 0001:00 May 27 18:07:19.225674 kernel: pci_bus 0001:00: root bus resource [mem 0x60000000-0x6fffffff window] May 27 18:07:19.225730 kernel: pci_bus 0001:00: root bus resource [mem 0x380000000000-0x3bffdfffffff window] May 27 18:07:19.225784 kernel: pci_bus 0001:00: root bus resource [bus 00-ff] May 27 18:07:19.225852 kernel: pci 0001:00:00.0: [1def:e100] type 00 class 0x060000 conventional PCI endpoint May 27 18:07:19.225921 kernel: pci 0001:00:01.0: [1def:e101] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.225986 kernel: pci 0001:00:01.0: PCI bridge to [bus 01] May 27 18:07:19.226047 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff] May 27 18:07:19.226108 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref] May 27 18:07:19.226169 kernel: pci 0001:00:01.0: enabling 
Extended Tags May 27 18:07:19.226231 kernel: pci 0001:00:01.0: supports D1 D2 May 27 18:07:19.226293 kernel: pci 0001:00:01.0: PME# supported from D0 D1 D3hot May 27 18:07:19.226363 kernel: pci 0001:00:02.0: [1def:e102] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.226427 kernel: pci 0001:00:02.0: PCI bridge to [bus 02] May 27 18:07:19.226489 kernel: pci 0001:00:02.0: supports D1 D2 May 27 18:07:19.226549 kernel: pci 0001:00:02.0: PME# supported from D0 D1 D3hot May 27 18:07:19.226622 kernel: pci 0001:00:03.0: [1def:e103] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.226685 kernel: pci 0001:00:03.0: PCI bridge to [bus 03] May 27 18:07:19.226746 kernel: pci 0001:00:03.0: supports D1 D2 May 27 18:07:19.226807 kernel: pci 0001:00:03.0: PME# supported from D0 D1 D3hot May 27 18:07:19.226876 kernel: pci 0001:00:04.0: [1def:e104] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.226938 kernel: pci 0001:00:04.0: PCI bridge to [bus 04] May 27 18:07:19.226999 kernel: pci 0001:00:04.0: supports D1 D2 May 27 18:07:19.227060 kernel: pci 0001:00:04.0: PME# supported from D0 D1 D3hot May 27 18:07:19.227070 kernel: acpiphp: Slot [1-6] registered May 27 18:07:19.227138 kernel: pci 0001:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint May 27 18:07:19.227202 kernel: pci 0001:01:00.0: BAR 0 [mem 0x380002000000-0x380003ffffff 64bit pref] May 27 18:07:19.227267 kernel: pci 0001:01:00.0: ROM [mem 0x60100000-0x601fffff pref] May 27 18:07:19.227330 kernel: pci 0001:01:00.0: PME# supported from D3cold May 27 18:07:19.227393 kernel: pci 0001:01:00.0: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 18:07:19.227462 kernel: pci 0001:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint May 27 18:07:19.227525 kernel: pci 0001:01:00.1: BAR 0 [mem 0x380000000000-0x380001ffffff 64bit pref] May 27 18:07:19.227592 kernel: pci 0001:01:00.1: ROM [mem 0x60000000-0x600fffff 
pref] May 27 18:07:19.227656 kernel: pci 0001:01:00.1: PME# supported from D3cold May 27 18:07:19.227668 kernel: acpiphp: Slot [2-6] registered May 27 18:07:19.227676 kernel: acpiphp: Slot [3-4] registered May 27 18:07:19.227683 kernel: acpiphp: Slot [4-4] registered May 27 18:07:19.227737 kernel: pci_bus 0001:00: on NUMA node 0 May 27 18:07:19.227800 kernel: pci 0001:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 27 18:07:19.227863 kernel: pci 0001:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 27 18:07:19.227928 kernel: pci 0001:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 27 18:07:19.227990 kernel: pci 0001:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 02] add_size 200000 add_align 100000 May 27 18:07:19.228053 kernel: pci 0001:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 27 18:07:19.228115 kernel: pci 0001:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.228177 kernel: pci 0001:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 03] add_size 200000 add_align 100000 May 27 18:07:19.228239 kernel: pci 0001:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 27 18:07:19.228301 kernel: pci 0001:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.228362 kernel: pci 0001:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 27 18:07:19.228424 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref]: assigned May 27 18:07:19.228488 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff]: assigned May 27 18:07:19.228549 kernel: pci 0001:00:02.0: bridge window [mem 0x60200000-0x603fffff]: assigned May 27 18:07:19.228615 kernel: pci 0001:00:02.0: bridge window 
[mem 0x380004000000-0x3800041fffff 64bit pref]: assigned May 27 18:07:19.228677 kernel: pci 0001:00:03.0: bridge window [mem 0x60400000-0x605fffff]: assigned May 27 18:07:19.228738 kernel: pci 0001:00:03.0: bridge window [mem 0x380004200000-0x3800043fffff 64bit pref]: assigned May 27 18:07:19.228799 kernel: pci 0001:00:04.0: bridge window [mem 0x60600000-0x607fffff]: assigned May 27 18:07:19.228861 kernel: pci 0001:00:04.0: bridge window [mem 0x380004400000-0x3800045fffff 64bit pref]: assigned May 27 18:07:19.228923 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.228987 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229048 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229109 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229172 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229233 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229295 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229356 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229417 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229480 kernel: pci 0001:00:04.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229541 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229605 kernel: pci 0001:00:03.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229667 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: can't assign; no space May 27 18:07:19.229728 kernel: pci 0001:00:02.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229789 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: can't 
assign; no space May 27 18:07:19.229850 kernel: pci 0001:00:01.0: bridge window [io size 0x1000]: failed to assign May 27 18:07:19.229914 kernel: pci 0001:01:00.0: BAR 0 [mem 0x380000000000-0x380001ffffff 64bit pref]: assigned May 27 18:07:19.229980 kernel: pci 0001:01:00.1: BAR 0 [mem 0x380002000000-0x380003ffffff 64bit pref]: assigned May 27 18:07:19.230043 kernel: pci 0001:01:00.0: ROM [mem 0x60000000-0x600fffff pref]: assigned May 27 18:07:19.230107 kernel: pci 0001:01:00.1: ROM [mem 0x60100000-0x601fffff pref]: assigned May 27 18:07:19.230168 kernel: pci 0001:00:01.0: PCI bridge to [bus 01] May 27 18:07:19.230230 kernel: pci 0001:00:01.0: bridge window [mem 0x60000000-0x601fffff] May 27 18:07:19.230291 kernel: pci 0001:00:01.0: bridge window [mem 0x380000000000-0x380003ffffff 64bit pref] May 27 18:07:19.230352 kernel: pci 0001:00:02.0: PCI bridge to [bus 02] May 27 18:07:19.230416 kernel: pci 0001:00:02.0: bridge window [mem 0x60200000-0x603fffff] May 27 18:07:19.230479 kernel: pci 0001:00:02.0: bridge window [mem 0x380004000000-0x3800041fffff 64bit pref] May 27 18:07:19.230541 kernel: pci 0001:00:03.0: PCI bridge to [bus 03] May 27 18:07:19.230606 kernel: pci 0001:00:03.0: bridge window [mem 0x60400000-0x605fffff] May 27 18:07:19.230668 kernel: pci 0001:00:03.0: bridge window [mem 0x380004200000-0x3800043fffff 64bit pref] May 27 18:07:19.230730 kernel: pci 0001:00:04.0: PCI bridge to [bus 04] May 27 18:07:19.230791 kernel: pci 0001:00:04.0: bridge window [mem 0x60600000-0x607fffff] May 27 18:07:19.230854 kernel: pci 0001:00:04.0: bridge window [mem 0x380004400000-0x3800045fffff 64bit pref] May 27 18:07:19.230910 kernel: pci_bus 0001:00: resource 4 [mem 0x60000000-0x6fffffff window] May 27 18:07:19.230964 kernel: pci_bus 0001:00: resource 5 [mem 0x380000000000-0x3bffdfffffff window] May 27 18:07:19.231029 kernel: pci_bus 0001:01: resource 1 [mem 0x60000000-0x601fffff] May 27 18:07:19.231086 kernel: pci_bus 0001:01: resource 2 [mem 0x380000000000-0x380003ffffff 
64bit pref] May 27 18:07:19.231158 kernel: pci_bus 0001:02: resource 1 [mem 0x60200000-0x603fffff] May 27 18:07:19.231217 kernel: pci_bus 0001:02: resource 2 [mem 0x380004000000-0x3800041fffff 64bit pref] May 27 18:07:19.231281 kernel: pci_bus 0001:03: resource 1 [mem 0x60400000-0x605fffff] May 27 18:07:19.231337 kernel: pci_bus 0001:03: resource 2 [mem 0x380004200000-0x3800043fffff 64bit pref] May 27 18:07:19.231401 kernel: pci_bus 0001:04: resource 1 [mem 0x60600000-0x607fffff] May 27 18:07:19.231458 kernel: pci_bus 0001:04: resource 2 [mem 0x380004400000-0x3800045fffff 64bit pref] May 27 18:07:19.231468 kernel: ACPI: PCI Root Bridge [PCI6] (domain 0004 [bus 00-ff]) May 27 18:07:19.231532 kernel: acpi PNP0A08:07: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 18:07:19.231599 kernel: acpi PNP0A08:07: _OSC: platform does not support [PCIeHotplug PME LTR] May 27 18:07:19.231660 kernel: acpi PNP0A08:07: _OSC: OS now controls [AER PCIeCapability] May 27 18:07:19.231719 kernel: acpi PNP0A08:07: MCFG quirk: ECAM at [mem 0x2bfff0000000-0x2bffffffffff] for [bus 00-ff] with pci_32b_read_ops May 27 18:07:19.231777 kernel: acpi PNP0A08:07: ECAM area [mem 0x2bfff0000000-0x2bffffffffff] reserved by PNP0C02:00 May 27 18:07:19.231835 kernel: acpi PNP0A08:07: ECAM at [mem 0x2bfff0000000-0x2bffffffffff] for [bus 00-ff] May 27 18:07:19.231846 kernel: PCI host bridge to bus 0004:00 May 27 18:07:19.231906 kernel: pci_bus 0004:00: root bus resource [mem 0x20000000-0x2fffffff window] May 27 18:07:19.231963 kernel: pci_bus 0004:00: root bus resource [mem 0x280000000000-0x2bffdfffffff window] May 27 18:07:19.232017 kernel: pci_bus 0004:00: root bus resource [bus 00-ff] May 27 18:07:19.232087 kernel: pci 0004:00:00.0: [1def:e110] type 00 class 0x060000 conventional PCI endpoint May 27 18:07:19.232158 kernel: pci 0004:00:01.0: [1def:e111] type 01 class 0x060400 PCIe Root Port May 27 18:07:19.232221 kernel: pci 0004:00:01.0: PCI bridge to [bus 01-02] May 27 
18:07:19.232282 kernel: pci 0004:00:01.0: bridge window [io 0x0000-0x0fff]
May 27 18:07:19.232345 kernel: pci 0004:00:01.0: bridge window [mem 0x20000000-0x220fffff]
May 27 18:07:19.232406 kernel: pci 0004:00:01.0: supports D1 D2
May 27 18:07:19.232466 kernel: pci 0004:00:01.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.232533 kernel: pci 0004:00:03.0: [1def:e113] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.232599 kernel: pci 0004:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.232661 kernel: pci 0004:00:03.0: bridge window [mem 0x22200000-0x222fffff]
May 27 18:07:19.232722 kernel: pci 0004:00:03.0: supports D1 D2
May 27 18:07:19.232783 kernel: pci 0004:00:03.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.232856 kernel: pci 0004:00:05.0: [1def:e115] type 01 class 0x060400 PCIe Root Port
May 27 18:07:19.232918 kernel: pci 0004:00:05.0: PCI bridge to [bus 04]
May 27 18:07:19.232979 kernel: pci 0004:00:05.0: supports D1 D2
May 27 18:07:19.233040 kernel: pci 0004:00:05.0: PME# supported from D0 D1 D3hot
May 27 18:07:19.233109 kernel: pci 0004:01:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
May 27 18:07:19.233174 kernel: pci 0004:01:00.0: PCI bridge to [bus 02]
May 27 18:07:19.233236 kernel: pci 0004:01:00.0: bridge window [io 0x0000-0x0fff]
May 27 18:07:19.233301 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x220fffff]
May 27 18:07:19.233364 kernel: pci 0004:01:00.0: enabling Extended Tags
May 27 18:07:19.233426 kernel: pci 0004:01:00.0: supports D1 D2
May 27 18:07:19.233489 kernel: pci 0004:01:00.0: PME# supported from D0 D1 D2 D3hot D3cold
May 27 18:07:19.233555 kernel: pci_bus 0004:02: extended config space not accessible
May 27 18:07:19.233633 kernel: pci 0004:02:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint
May 27 18:07:19.233699 kernel: pci 0004:02:00.0: BAR 0 [mem 0x20000000-0x21ffffff]
May 27 18:07:19.233766 kernel: pci 0004:02:00.0: BAR 1 [mem 0x22000000-0x2201ffff]
May 27 18:07:19.233831 kernel: pci 0004:02:00.0: BAR 2 [io 0x0000-0x007f]
May 27 18:07:19.233896 kernel: pci 0004:02:00.0: supports D1 D2
May 27 18:07:19.233961 kernel: pci 0004:02:00.0: PME# supported from D0 D1 D2 D3hot D3cold
May 27 18:07:19.234037 kernel: pci 0004:03:00.0: [1912:0014] type 00 class 0x0c0330 PCIe Endpoint
May 27 18:07:19.234102 kernel: pci 0004:03:00.0: BAR 0 [mem 0x22200000-0x22201fff 64bit]
May 27 18:07:19.234164 kernel: pci 0004:03:00.0: PME# supported from D0 D3hot D3cold
May 27 18:07:19.234222 kernel: pci_bus 0004:00: on NUMA node 0
May 27 18:07:19.234285 kernel: pci 0004:00:01.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 01-02] add_size 200000 add_align 100000
May 27 18:07:19.234347 kernel: pci 0004:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 27 18:07:19.234409 kernel: pci 0004:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
May 27 18:07:19.234471 kernel: pci 0004:00:03.0: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
May 27 18:07:19.234534 kernel: pci 0004:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 27 18:07:19.234599 kernel: pci 0004:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.234663 kernel: pci 0004:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
May 27 18:07:19.234726 kernel: pci 0004:00:01.0: bridge window [mem 0x20000000-0x22ffffff]: assigned
May 27 18:07:19.234787 kernel: pci 0004:00:01.0: bridge window [mem 0x280000000000-0x2800001fffff 64bit pref]: assigned
May 27 18:07:19.234849 kernel: pci 0004:00:03.0: bridge window [mem 0x23000000-0x231fffff]: assigned
May 27 18:07:19.234910 kernel: pci 0004:00:03.0: bridge window [mem 0x280000200000-0x2800003fffff 64bit pref]: assigned
May 27 18:07:19.234972 kernel: pci 0004:00:05.0: bridge window [mem 0x23200000-0x233fffff]: assigned
May 27 18:07:19.235034 kernel: pci 0004:00:05.0: bridge window [mem 0x280000400000-0x2800005fffff 64bit pref]: assigned
May 27 18:07:19.235097 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235158 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235219 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235282 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235344 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235405 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235470 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235532 kernel: pci 0004:00:01.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235599 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235662 kernel: pci 0004:00:05.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235723 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235784 kernel: pci 0004:00:03.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.235849 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x22ffffff]: assigned
May 27 18:07:19.235912 kernel: pci 0004:01:00.0: bridge window [io size 0x1000]: can't assign; no space
May 27 18:07:19.235975 kernel: pci 0004:01:00.0: bridge window [io size 0x1000]: failed to assign
May 27 18:07:19.236041 kernel: pci 0004:02:00.0: BAR 0 [mem 0x20000000-0x21ffffff]: assigned
May 27 18:07:19.236108 kernel: pci 0004:02:00.0: BAR 1 [mem 0x22000000-0x2201ffff]: assigned
May 27 18:07:19.236173 kernel: pci 0004:02:00.0: BAR 2 [io size 0x0080]: can't assign; no space
May 27 18:07:19.236238 kernel: pci 0004:02:00.0: BAR 2 [io size 0x0080]: failed to assign
May 27 18:07:19.236317 kernel: pci 0004:01:00.0: PCI bridge to [bus 02]
May 27 18:07:19.236381 kernel: pci 0004:01:00.0: bridge window [mem 0x20000000-0x22ffffff]
May 27 18:07:19.236443 kernel: pci 0004:00:01.0: PCI bridge to [bus 01-02]
May 27 18:07:19.236504 kernel: pci 0004:00:01.0: bridge window [mem 0x20000000-0x22ffffff]
May 27 18:07:19.236567 kernel: pci 0004:00:01.0: bridge window [mem 0x280000000000-0x2800001fffff 64bit pref]
May 27 18:07:19.236651 kernel: pci 0004:03:00.0: BAR 0 [mem 0x23000000-0x23001fff 64bit]: assigned
May 27 18:07:19.236714 kernel: pci 0004:00:03.0: PCI bridge to [bus 03]
May 27 18:07:19.236778 kernel: pci 0004:00:03.0: bridge window [mem 0x23000000-0x231fffff]
May 27 18:07:19.236839 kernel: pci 0004:00:03.0: bridge window [mem 0x280000200000-0x2800003fffff 64bit pref]
May 27 18:07:19.236901 kernel: pci 0004:00:05.0: PCI bridge to [bus 04]
May 27 18:07:19.236963 kernel: pci 0004:00:05.0: bridge window [mem 0x23200000-0x233fffff]
May 27 18:07:19.237024 kernel: pci 0004:00:05.0: bridge window [mem 0x280000400000-0x2800005fffff 64bit pref]
May 27 18:07:19.237083 kernel: pci_bus 0004:00: Some PCI device resources are unassigned, try booting with pci=realloc
May 27 18:07:19.237137 kernel: pci_bus 0004:00: resource 4 [mem 0x20000000-0x2fffffff window]
May 27 18:07:19.237192 kernel: pci_bus 0004:00: resource 5 [mem 0x280000000000-0x2bffdfffffff window]
May 27 18:07:19.237256 kernel: pci_bus 0004:01: resource 1 [mem 0x20000000-0x22ffffff]
May 27 18:07:19.237313 kernel: pci_bus 0004:01: resource 2 [mem 0x280000000000-0x2800001fffff 64bit pref]
May 27 18:07:19.237374 kernel: pci_bus 0004:02: resource 1 [mem 0x20000000-0x22ffffff]
May 27 18:07:19.237440 kernel: pci_bus 0004:03: resource 1 [mem 0x23000000-0x231fffff]
May 27 18:07:19.237497 kernel: pci_bus 0004:03: resource 2 [mem 0x280000200000-0x2800003fffff 64bit pref]
May 27 18:07:19.237560 kernel: pci_bus 0004:04: resource 1 [mem 0x23200000-0x233fffff]
May 27 18:07:19.237623 kernel: pci_bus 0004:04: resource 2 [mem 0x280000400000-0x2800005fffff 64bit pref]
May 27 18:07:19.237633 kernel: ACPI: CPU18 has been hot-added
May 27 18:07:19.237641 kernel: ACPI: CPU58 has been hot-added
May 27 18:07:19.237649 kernel: ACPI: CPU38 has been hot-added
May 27 18:07:19.237657 kernel: ACPI: CPU78 has been hot-added
May 27 18:07:19.237666 kernel: ACPI: CPU16 has been hot-added
May 27 18:07:19.237675 kernel: ACPI: CPU56 has been hot-added
May 27 18:07:19.237682 kernel: ACPI: CPU36 has been hot-added
May 27 18:07:19.237690 kernel: ACPI: CPU76 has been hot-added
May 27 18:07:19.237698 kernel: ACPI: CPU17 has been hot-added
May 27 18:07:19.237705 kernel: ACPI: CPU57 has been hot-added
May 27 18:07:19.237713 kernel: ACPI: CPU37 has been hot-added
May 27 18:07:19.237721 kernel: ACPI: CPU77 has been hot-added
May 27 18:07:19.237729 kernel: ACPI: CPU19 has been hot-added
May 27 18:07:19.237737 kernel: ACPI: CPU59 has been hot-added
May 27 18:07:19.237745 kernel: ACPI: CPU39 has been hot-added
May 27 18:07:19.237753 kernel: ACPI: CPU79 has been hot-added
May 27 18:07:19.237761 kernel: ACPI: CPU12 has been hot-added
May 27 18:07:19.237769 kernel: ACPI: CPU52 has been hot-added
May 27 18:07:19.237776 kernel: ACPI: CPU32 has been hot-added
May 27 18:07:19.237784 kernel: ACPI: CPU72 has been hot-added
May 27 18:07:19.237791 kernel: ACPI: CPU8 has been hot-added
May 27 18:07:19.237799 kernel: ACPI: CPU48 has been hot-added
May 27 18:07:19.237807 kernel: ACPI: CPU28 has been hot-added
May 27 18:07:19.237816 kernel: ACPI: CPU68 has been hot-added
May 27 18:07:19.237823 kernel: ACPI: CPU10 has been hot-added
May 27 18:07:19.237831 kernel: ACPI: CPU50 has been hot-added
May 27 18:07:19.237839 kernel: ACPI: CPU30 has been hot-added
May 27 18:07:19.237847 kernel: ACPI: CPU70 has been hot-added
May 27 18:07:19.237854 kernel: ACPI: CPU14 has been hot-added
May 27 18:07:19.237862 kernel: ACPI: CPU54 has been hot-added
May 27 18:07:19.237869 kernel: ACPI: CPU34 has been hot-added
May 27 18:07:19.237877 kernel: ACPI: CPU74 has been hot-added
May 27 18:07:19.237886 kernel: ACPI: CPU4 has been hot-added
May 27 18:07:19.237894 kernel: ACPI: CPU44 has been hot-added
May 27 18:07:19.237902 kernel: ACPI: CPU24 has been hot-added
May 27 18:07:19.237909 kernel: ACPI: CPU64 has been hot-added
May 27 18:07:19.237917 kernel: ACPI: CPU0 has been hot-added
May 27 18:07:19.237925 kernel: ACPI: CPU40 has been hot-added
May 27 18:07:19.237933 kernel: ACPI: CPU20 has been hot-added
May 27 18:07:19.237940 kernel: ACPI: CPU60 has been hot-added
May 27 18:07:19.237948 kernel: ACPI: CPU2 has been hot-added
May 27 18:07:19.237956 kernel: ACPI: CPU42 has been hot-added
May 27 18:07:19.237965 kernel: ACPI: CPU22 has been hot-added
May 27 18:07:19.237973 kernel: ACPI: CPU62 has been hot-added
May 27 18:07:19.237980 kernel: ACPI: CPU6 has been hot-added
May 27 18:07:19.237988 kernel: ACPI: CPU46 has been hot-added
May 27 18:07:19.237997 kernel: ACPI: CPU26 has been hot-added
May 27 18:07:19.238005 kernel: ACPI: CPU66 has been hot-added
May 27 18:07:19.238013 kernel: ACPI: CPU5 has been hot-added
May 27 18:07:19.238021 kernel: ACPI: CPU45 has been hot-added
May 27 18:07:19.238028 kernel: ACPI: CPU25 has been hot-added
May 27 18:07:19.238037 kernel: ACPI: CPU65 has been hot-added
May 27 18:07:19.238045 kernel: ACPI: CPU1 has been hot-added
May 27 18:07:19.238053 kernel: ACPI: CPU41 has been hot-added
May 27 18:07:19.238060 kernel: ACPI: CPU21 has been hot-added
May 27 18:07:19.238068 kernel: ACPI: CPU61 has been hot-added
May 27 18:07:19.238076 kernel: ACPI: CPU3 has been hot-added
May 27 18:07:19.238083 kernel: ACPI: CPU43 has been hot-added
May 27 18:07:19.238091 kernel: ACPI: CPU23 has been hot-added
May 27 18:07:19.238098 kernel: ACPI: CPU63 has been hot-added
May 27 18:07:19.238106 kernel: ACPI: CPU7 has been hot-added
May 27 18:07:19.238115 kernel: ACPI: CPU47 has been hot-added
May 27 18:07:19.238123 kernel: ACPI: CPU27 has been hot-added
May 27 18:07:19.238130 kernel: ACPI: CPU67 has been hot-added
May 27 18:07:19.238138 kernel: ACPI: CPU13 has been hot-added
May 27 18:07:19.238146 kernel: ACPI: CPU53 has been hot-added
May 27 18:07:19.238154 kernel: ACPI: CPU33 has been hot-added
May 27 18:07:19.238161 kernel: ACPI: CPU73 has been hot-added
May 27 18:07:19.238169 kernel: ACPI: CPU9 has been hot-added
May 27 18:07:19.238177 kernel: ACPI: CPU49 has been hot-added
May 27 18:07:19.238186 kernel: ACPI: CPU29 has been hot-added
May 27 18:07:19.238193 kernel: ACPI: CPU69 has been hot-added
May 27 18:07:19.238201 kernel: ACPI: CPU11 has been hot-added
May 27 18:07:19.238209 kernel: ACPI: CPU51 has been hot-added
May 27 18:07:19.238216 kernel: ACPI: CPU31 has been hot-added
May 27 18:07:19.238224 kernel: ACPI: CPU71 has been hot-added
May 27 18:07:19.238232 kernel: ACPI: CPU15 has been hot-added
May 27 18:07:19.238239 kernel: ACPI: CPU55 has been hot-added
May 27 18:07:19.238247 kernel: ACPI: CPU35 has been hot-added
May 27 18:07:19.238255 kernel: ACPI: CPU75 has been hot-added
May 27 18:07:19.238264 kernel: iommu: Default domain type: Translated
May 27 18:07:19.238271 kernel: iommu: DMA domain TLB invalidation policy: strict mode
May 27 18:07:19.238279 kernel: efivars: Registered efivars operations
May 27 18:07:19.238346 kernel: pci 0004:02:00.0: vgaarb: setting as boot VGA device
May 27 18:07:19.238412 kernel: pci 0004:02:00.0: vgaarb: bridge control possible
May 27 18:07:19.238477 kernel: pci 0004:02:00.0: vgaarb: VGA device added: decodes=io+mem,owns=none,locks=none
May 27 18:07:19.238487 kernel: vgaarb: loaded
May 27 18:07:19.238495 kernel: clocksource: Switched to clocksource arch_sys_counter
May 27 18:07:19.238504 kernel: VFS: Disk quotas dquot_6.6.0
May 27 18:07:19.238512 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 18:07:19.238520 kernel: pnp: PnP ACPI init
May 27 18:07:19.238590 kernel: system 00:00: [mem 0x3bfff0000000-0x3bffffffffff window] could not be reserved
May 27 18:07:19.238649 kernel: system 00:00: [mem 0x3ffff0000000-0x3fffffffffff window] could not be reserved
May 27 18:07:19.238705 kernel: system 00:00: [mem 0x23fff0000000-0x23ffffffffff window] could not be reserved
May 27 18:07:19.238760 kernel: system 00:00: [mem 0x27fff0000000-0x27ffffffffff window] could not be reserved
May 27 18:07:19.238815 kernel: system 00:00: [mem 0x2bfff0000000-0x2bffffffffff window] could not be reserved
May 27 18:07:19.238872 kernel: system 00:00: [mem 0x2ffff0000000-0x2fffffffffff window] could not be reserved
May 27 18:07:19.238928 kernel: system 00:00: [mem 0x33fff0000000-0x33ffffffffff window] could not be reserved
May 27 18:07:19.238983 kernel: system 00:00: [mem 0x37fff0000000-0x37ffffffffff window] could not be reserved
May 27 18:07:19.238993 kernel: pnp: PnP ACPI: found 1 devices
May 27 18:07:19.239001 kernel: NET: Registered PF_INET protocol family
May 27 18:07:19.239009 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.239017 kernel: tcp_listen_portaddr_hash hash table entries: 65536 (order: 8, 1048576 bytes, linear)
May 27 18:07:19.239027 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 18:07:19.239035 kernel: TCP established hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 27 18:07:19.239043 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.239051 kernel: TCP: Hash tables configured (established 524288 bind 65536)
May 27 18:07:19.239060 kernel: UDP hash table entries: 65536 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.239068 kernel: UDP-Lite hash table entries: 65536 (order: 9, 2097152 bytes, linear)
May 27 18:07:19.239075 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 18:07:19.239139 kernel: pci 0001:01:00.0: CLS mismatch (64 != 32), using 64 bytes
May 27 18:07:19.239149 kernel: kvm [1]: nv: 554 coarse grained trap handlers
May 27 18:07:19.239158 kernel: kvm [1]: IPA Size Limit: 48 bits
May 27 18:07:19.239166 kernel: kvm [1]: GICv3: no GICV resource entry
May 27 18:07:19.239174 kernel: kvm [1]: disabling GICv2 emulation
May 27 18:07:19.239182 kernel: kvm [1]: GIC system register CPU interface enabled
May 27 18:07:19.239189 kernel: kvm [1]: vgic interrupt IRQ9
May 27 18:07:19.239197 kernel: kvm [1]: VHE mode initialized successfully
May 27 18:07:19.239205 kernel: Initialise system trusted keyrings
May 27 18:07:19.239212 kernel: workingset: timestamp_bits=39 max_order=26 bucket_order=0
May 27 18:07:19.239220 kernel: Key type asymmetric registered
May 27 18:07:19.239229 kernel: Asymmetric key parser 'x509' registered
May 27 18:07:19.239236 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
May 27 18:07:19.239244 kernel: io scheduler mq-deadline registered
May 27 18:07:19.239252 kernel: io scheduler kyber registered
May 27 18:07:19.239259 kernel: io scheduler bfq registered
May 27 18:07:19.239267 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
May 27 18:07:19.239275 kernel: ACPI: button: Power Button [PWRB]
May 27 18:07:19.239282 kernel: ACPI GTDT: found 1 SBSA generic Watchdog(s).
May 27 18:07:19.239290 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 18:07:19.239358 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: option mask 0x0
May 27 18:07:19.239420 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.239478 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.239535 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 262144 entries for cmdq
May 27 18:07:19.239596 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 131072 entries for evtq
May 27 18:07:19.239653 kernel: arm-smmu-v3 arm-smmu-v3.0.auto: allocated 262144 entries for priq
May 27 18:07:19.239719 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: option mask 0x0
May 27 18:07:19.239779 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.239836 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.239892 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 262144 entries for cmdq
May 27 18:07:19.239948 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 131072 entries for evtq
May 27 18:07:19.240004 kernel: arm-smmu-v3 arm-smmu-v3.1.auto: allocated 262144 entries for priq
May 27 18:07:19.240068 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: option mask 0x0
May 27 18:07:19.240127 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.240183 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.240239 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 262144 entries for cmdq
May 27 18:07:19.240296 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 131072 entries for evtq
May 27 18:07:19.240352 kernel: arm-smmu-v3 arm-smmu-v3.2.auto: allocated 262144 entries for priq
May 27 18:07:19.240415 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: option mask 0x0
May 27 18:07:19.240472 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.240531 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.240591 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 262144 entries for cmdq
May 27 18:07:19.240648 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 131072 entries for evtq
May 27 18:07:19.240705 kernel: arm-smmu-v3 arm-smmu-v3.3.auto: allocated 262144 entries for priq
May 27 18:07:19.240768 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: option mask 0x0
May 27 18:07:19.240825 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.240884 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.240941 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 262144 entries for cmdq
May 27 18:07:19.240997 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 131072 entries for evtq
May 27 18:07:19.241054 kernel: arm-smmu-v3 arm-smmu-v3.4.auto: allocated 262144 entries for priq
May 27 18:07:19.241117 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: option mask 0x0
May 27 18:07:19.241175 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.241231 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.241290 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 262144 entries for cmdq
May 27 18:07:19.241346 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 131072 entries for evtq
May 27 18:07:19.241403 kernel: arm-smmu-v3 arm-smmu-v3.5.auto: allocated 262144 entries for priq
May 27 18:07:19.241473 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: option mask 0x0
May 27 18:07:19.241532 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.241595 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.241652 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 262144 entries for cmdq
May 27 18:07:19.241713 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 131072 entries for evtq
May 27 18:07:19.241770 kernel: arm-smmu-v3 arm-smmu-v3.6.auto: allocated 262144 entries for priq
May 27 18:07:19.241835 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: option mask 0x0
May 27 18:07:19.241894 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: IDR0.COHACC overridden by FW configuration (false)
May 27 18:07:19.241951 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: ias 48-bit, oas 48-bit (features 0x001c1eff)
May 27 18:07:19.242009 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 262144 entries for cmdq
May 27 18:07:19.242066 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 131072 entries for evtq
May 27 18:07:19.242122 kernel: arm-smmu-v3 arm-smmu-v3.7.auto: allocated 262144 entries for priq
May 27 18:07:19.242132 kernel: thunder_xcv, ver 1.0
May 27 18:07:19.242140 kernel: thunder_bgx, ver 1.0
May 27 18:07:19.242148 kernel: nicpf, ver 1.0
May 27 18:07:19.242155 kernel: nicvf, ver 1.0
May 27 18:07:19.242218 kernel: rtc-efi rtc-efi.0: registered as rtc0
May 27 18:07:19.242278 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-27T18:07:17 UTC (1748369237)
May 27 18:07:19.242288 kernel: efifb: probing for efifb
May 27 18:07:19.242296 kernel: efifb: framebuffer at 0x20000000, using 1876k, total 1875k
May 27 18:07:19.242303 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1
May 27 18:07:19.242311 kernel: efifb: scrolling: redraw
May 27 18:07:19.242319 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 27 18:07:19.242327 kernel: Console: switching to colour frame buffer device 100x37
May 27 18:07:19.242334 kernel: fb0: EFI VGA frame buffer device
May 27 18:07:19.242342 kernel: SMCCC: SOC_ID: ID = jep106:0a16:0001 Revision = 0x000000a1
May 27 18:07:19.242351 kernel: hid: raw HID events driver (C) Jiri Kosina
May 27 18:07:19.242359 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
May 27 18:07:19.242367 kernel: watchdog: NMI not fully supported
May 27 18:07:19.242375 kernel: NET: Registered PF_INET6 protocol family
May 27 18:07:19.242382 kernel: watchdog: Hard watchdog permanently disabled
May 27 18:07:19.242390 kernel: Segment Routing with IPv6
May 27 18:07:19.242397 kernel: In-situ OAM (IOAM) with IPv6
May 27 18:07:19.242405 kernel: NET: Registered PF_PACKET protocol family
May 27 18:07:19.242413 kernel: Key type dns_resolver registered
May 27 18:07:19.242422 kernel: registered taskstats version 1
May 27 18:07:19.242429 kernel: Loading compiled-in X.509 certificates
May 27 18:07:19.242437 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 8e5e45c34fa91568ef1fa3bdfd5a71a43b4c4580'
May 27 18:07:19.242445 kernel: Demotion targets for Node 0: null
May 27 18:07:19.242452 kernel: Key type .fscrypt registered
May 27 18:07:19.242460 kernel: Key type fscrypt-provisioning registered
May 27 18:07:19.242467 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 18:07:19.242475 kernel: ima: Allocated hash algorithm: sha1
May 27 18:07:19.242483 kernel: ima: No architecture policies found
May 27 18:07:19.242492 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
May 27 18:07:19.242555 kernel: pcieport 000d:00:01.0: Adding to iommu group 0
May 27 18:07:19.242623 kernel: pcieport 000d:00:01.0: AER: enabled with IRQ 91
May 27 18:07:19.242687 kernel: pcieport 000d:00:02.0: Adding to iommu group 1
May 27 18:07:19.242750 kernel: pcieport 000d:00:02.0: AER: enabled with IRQ 91
May 27 18:07:19.242813 kernel: pcieport 000d:00:03.0: Adding to iommu group 2
May 27 18:07:19.242875 kernel: pcieport 000d:00:03.0: AER: enabled with IRQ 91
May 27 18:07:19.242938 kernel: pcieport 000d:00:04.0: Adding to iommu group 3
May 27 18:07:19.243000 kernel: pcieport 000d:00:04.0: AER: enabled with IRQ 91
May 27 18:07:19.243066 kernel: pcieport 0000:00:01.0: Adding to iommu group 4
May 27 18:07:19.243128 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 92
May 27 18:07:19.243191 kernel: pcieport 0000:00:02.0: Adding to iommu group 5
May 27 18:07:19.243253 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 92
May 27 18:07:19.243316 kernel: pcieport 0000:00:03.0: Adding to iommu group 6
May 27 18:07:19.243377 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 92
May 27 18:07:19.243440 kernel: pcieport 0000:00:04.0: Adding to iommu group 7
May 27 18:07:19.243502 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 92
May 27 18:07:19.243567 kernel: pcieport 0005:00:01.0: Adding to iommu group 8
May 27 18:07:19.243633 kernel: pcieport 0005:00:01.0: AER: enabled with IRQ 93
May 27 18:07:19.243697 kernel: pcieport 0005:00:03.0: Adding to iommu group 9
May 27 18:07:19.243759 kernel: pcieport 0005:00:03.0: AER: enabled with IRQ 93
May 27 18:07:19.243823 kernel: pcieport 0005:00:05.0: Adding to iommu group 10
May 27 18:07:19.243884 kernel: pcieport 0005:00:05.0: AER: enabled with IRQ 93
May 27 18:07:19.243947 kernel: pcieport 0005:00:07.0: Adding to iommu group 11
May 27 18:07:19.244009 kernel: pcieport 0005:00:07.0: AER: enabled with IRQ 93
May 27 18:07:19.244075 kernel: pcieport 0003:00:01.0: Adding to iommu group 12
May 27 18:07:19.244137 kernel: pcieport 0003:00:01.0: AER: enabled with IRQ 94
May 27 18:07:19.244200 kernel: pcieport 0003:00:03.0: Adding to iommu group 13
May 27 18:07:19.244264 kernel: pcieport 0003:00:03.0: AER: enabled with IRQ 94
May 27 18:07:19.244326 kernel: pcieport 0003:00:05.0: Adding to iommu group 14
May 27 18:07:19.244388 kernel: pcieport 0003:00:05.0: AER: enabled with IRQ 94
May 27 18:07:19.244451 kernel: pcieport 000c:00:01.0: Adding to iommu group 15
May 27 18:07:19.244513 kernel: pcieport 000c:00:01.0: AER: enabled with IRQ 95
May 27 18:07:19.244575 kernel: pcieport 000c:00:02.0: Adding to iommu group 16
May 27 18:07:19.244644 kernel: pcieport 000c:00:02.0: AER: enabled with IRQ 95
May 27 18:07:19.244707 kernel: pcieport 000c:00:03.0: Adding to iommu group 17
May 27 18:07:19.244770 kernel: pcieport 000c:00:03.0: AER: enabled with IRQ 95
May 27 18:07:19.244833 kernel: pcieport 000c:00:04.0: Adding to iommu group 18
May 27 18:07:19.244895 kernel: pcieport 000c:00:04.0: AER: enabled with IRQ 95
May 27 18:07:19.244958 kernel: pcieport 0002:00:01.0: Adding to iommu group 19
May 27 18:07:19.245020 kernel: pcieport 0002:00:01.0: AER: enabled with IRQ 96
May 27 18:07:19.245083 kernel: pcieport 0002:00:03.0: Adding to iommu group 20
May 27 18:07:19.245146 kernel: pcieport 0002:00:03.0: AER: enabled with IRQ 96
May 27 18:07:19.245210 kernel: pcieport 0002:00:05.0: Adding to iommu group 21
May 27 18:07:19.245272 kernel: pcieport 0002:00:05.0: AER: enabled with IRQ 96
May 27 18:07:19.245334 kernel: pcieport 0002:00:07.0: Adding to iommu group 22
May 27 18:07:19.245396 kernel: pcieport 0002:00:07.0: AER: enabled with IRQ 96
May 27 18:07:19.245459 kernel: pcieport 0001:00:01.0: Adding to iommu group 23
May 27 18:07:19.245521 kernel: pcieport 0001:00:01.0: AER: enabled with IRQ 97
May 27 18:07:19.245611 kernel: pcieport 0001:00:02.0: Adding to iommu group 24
May 27 18:07:19.245687 kernel: pcieport 0001:00:02.0: AER: enabled with IRQ 97
May 27 18:07:19.245753 kernel: pcieport 0001:00:03.0: Adding to iommu group 25
May 27 18:07:19.245818 kernel: pcieport 0001:00:03.0: AER: enabled with IRQ 97
May 27 18:07:19.245882 kernel: pcieport 0001:00:04.0: Adding to iommu group 26
May 27 18:07:19.245944 kernel: pcieport 0001:00:04.0: AER: enabled with IRQ 97
May 27 18:07:19.246006 kernel: pcieport 0004:00:01.0: Adding to iommu group 27
May 27 18:07:19.246068 kernel: pcieport 0004:00:01.0: AER: enabled with IRQ 98
May 27 18:07:19.246130 kernel: pcieport 0004:00:03.0: Adding to iommu group 28
May 27 18:07:19.246194 kernel: pcieport 0004:00:03.0: AER: enabled with IRQ 98
May 27 18:07:19.246256 kernel: pcieport 0004:00:05.0: Adding to iommu group 29
May 27 18:07:19.246317 kernel: pcieport 0004:00:05.0: AER: enabled with IRQ 98
May 27 18:07:19.246382 kernel: pcieport 0004:01:00.0: Adding to iommu group 30
May 27 18:07:19.246392 kernel: clk: Disabling unused clocks
May 27 18:07:19.246400 kernel: PM: genpd: Disabling unused power domains
May 27 18:07:19.246408 kernel: Warning: unable to open an initial console.
May 27 18:07:19.246415 kernel: Freeing unused kernel memory: 39424K
May 27 18:07:19.246423 kernel: Run /init as init process
May 27 18:07:19.246433 kernel: with arguments:
May 27 18:07:19.246441 kernel: /init
May 27 18:07:19.246448 kernel: with environment:
May 27 18:07:19.246455 kernel: HOME=/
May 27 18:07:19.246463 kernel: TERM=linux
May 27 18:07:19.246470 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 18:07:19.246479 systemd[1]: Successfully made /usr/ read-only.
May 27 18:07:19.246490 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 18:07:19.246500 systemd[1]: Detected architecture arm64.
May 27 18:07:19.246508 systemd[1]: Running in initrd.
May 27 18:07:19.246516 systemd[1]: No hostname configured, using default hostname.
May 27 18:07:19.246524 systemd[1]: Hostname set to .
May 27 18:07:19.246533 systemd[1]: Initializing machine ID from random generator.
May 27 18:07:19.246541 systemd[1]: Queued start job for default target initrd.target.
May 27 18:07:19.246549 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:07:19.246557 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:07:19.246567 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 18:07:19.246576 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 18:07:19.246589 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 18:07:19.246598 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 18:07:19.246607 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 18:07:19.246616 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 18:07:19.246626 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:07:19.246635 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 18:07:19.246643 systemd[1]: Reached target paths.target - Path Units.
May 27 18:07:19.246651 systemd[1]: Reached target slices.target - Slice Units.
May 27 18:07:19.246659 systemd[1]: Reached target swap.target - Swaps.
May 27 18:07:19.246667 systemd[1]: Reached target timers.target - Timer Units.
May 27 18:07:19.246675 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 18:07:19.246683 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 18:07:19.246691 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 18:07:19.246701 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 18:07:19.246709 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:07:19.246717 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 18:07:19.246725 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:07:19.246733 systemd[1]: Reached target sockets.target - Socket Units.
May 27 18:07:19.246741 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 18:07:19.246749 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 18:07:19.246757 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 18:07:19.246766 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 18:07:19.246776 systemd[1]: Starting systemd-fsck-usr.service...
May 27 18:07:19.246784 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 18:07:19.246792 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 18:07:19.246800 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 18:07:19.246827 systemd-journald[911]: Collecting audit messages is disabled.
May 27 18:07:19.246848 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 18:07:19.246857 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 18:07:19.246865 kernel: Bridge firewalling registered
May 27 18:07:19.246874 systemd-journald[911]: Journal started
May 27 18:07:19.246894 systemd-journald[911]: Runtime Journal (/run/log/journal/860552eb69c14bdbb3c37735724db508) is 8M, max 4G, 3.9G free.
May 27 18:07:19.180174 systemd-modules-load[913]: Inserted module 'overlay'
May 27 18:07:19.268839 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 18:07:19.236430 systemd-modules-load[913]: Inserted module 'br_netfilter'
May 27 18:07:19.274586 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:07:19.285667 systemd[1]: Finished systemd-fsck-usr.service.
May 27 18:07:19.296844 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 18:07:19.308666 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:07:19.322568 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 18:07:19.330659 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 18:07:19.362138 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 18:07:19.368954 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 18:07:19.381405 systemd-tmpfiles[944]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 18:07:19.398464 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 18:07:19.414785 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 18:07:19.432657 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 18:07:19.443363 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 18:07:19.463270 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 18:07:19.499796 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 18:07:19.513286 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 18:07:19.526321 dracut-cmdline[964]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=packet flatcar.autologin verity.usrhash=4e706b869299e1c88703222069cdfa08c45ebce568f762053eea5b3f5f0939c3 May 27 18:07:19.535598 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 18:07:19.539320 systemd-resolved[966]: Positive Trust Anchors: May 27 18:07:19.539329 systemd-resolved[966]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 18:07:19.539360 systemd-resolved[966]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 18:07:19.554894 systemd-resolved[966]: Defaulting to hostname 'linux'. May 27 18:07:19.572255 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 18:07:19.592655 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 18:07:19.696596 kernel: SCSI subsystem initialized May 27 18:07:19.712588 kernel: Loading iSCSI transport class v2.0-870. May 27 18:07:19.731588 kernel: iscsi: registered transport (tcp) May 27 18:07:19.759167 kernel: iscsi: registered transport (qla4xxx) May 27 18:07:19.759188 kernel: QLogic iSCSI HBA Driver May 27 18:07:19.777719 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 18:07:19.805664 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 18:07:19.822281 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 18:07:19.872340 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 18:07:19.884011 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 27 18:07:19.970593 kernel: raid6: neonx8 gen() 15849 MB/s May 27 18:07:19.996590 kernel: raid6: neonx4 gen() 15877 MB/s May 27 18:07:20.022590 kernel: raid6: neonx2 gen() 13315 MB/s May 27 18:07:20.047589 kernel: raid6: neonx1 gen() 10589 MB/s May 27 18:07:20.072589 kernel: raid6: int64x8 gen() 6925 MB/s May 27 18:07:20.097589 kernel: raid6: int64x4 gen() 7391 MB/s May 27 18:07:20.122589 kernel: raid6: int64x2 gen() 6130 MB/s May 27 18:07:20.151022 kernel: raid6: int64x1 gen() 5077 MB/s May 27 18:07:20.151048 kernel: raid6: using algorithm neonx4 gen() 15877 MB/s May 27 18:07:20.185451 kernel: raid6: .... xor() 12442 MB/s, rmw enabled May 27 18:07:20.185472 kernel: raid6: using neon recovery algorithm May 27 18:07:20.210066 kernel: xor: measuring software checksum speed May 27 18:07:20.210088 kernel: 8regs : 21658 MB/sec May 27 18:07:20.218455 kernel: 32regs : 21687 MB/sec May 27 18:07:20.226724 kernel: arm64_neon : 28215 MB/sec May 27 18:07:20.234810 kernel: xor: using function: arm64_neon (28215 MB/sec) May 27 18:07:20.301590 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 18:07:20.308605 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 18:07:20.315371 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 18:07:20.354673 systemd-udevd[1191]: Using default interface naming scheme 'v255'. May 27 18:07:20.358672 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 18:07:20.364923 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 18:07:20.401426 dracut-pre-trigger[1201]: rd.md=0: removing MD RAID activation May 27 18:07:20.424603 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 18:07:20.430680 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 18:07:20.733591 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
May 27 18:07:20.854301 kernel: pps_core: LinuxPPS API ver. 1 registered May 27 18:07:20.854333 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 27 18:07:20.854360 kernel: ACPI: bus type USB registered May 27 18:07:20.854378 kernel: PTP clock support registered May 27 18:07:20.854396 kernel: usbcore: registered new interface driver usbfs May 27 18:07:20.854406 kernel: usbcore: registered new interface driver hub May 27 18:07:20.854415 kernel: nvme 0005:03:00.0: Adding to iommu group 31 May 27 18:07:20.854564 kernel: usbcore: registered new device driver usb May 27 18:07:20.854575 kernel: nvme nvme0: pci function 0005:03:00.0 May 27 18:07:20.854675 kernel: nvme 0005:04:00.0: Adding to iommu group 32 May 27 18:07:20.854768 kernel: nvme nvme1: pci function 0005:04:00.0 May 27 18:07:20.854850 kernel: nvme nvme0: D3 entry latency set to 8 seconds May 27 18:07:20.854916 kernel: nvme nvme1: D3 entry latency set to 8 seconds May 27 18:07:20.873035 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 27 18:07:20.908266 kernel: nvme nvme1: 32/0/0 default/read/poll queues May 27 18:07:20.908375 kernel: nvme nvme0: 32/0/0 default/read/poll queues May 27 18:07:20.880515 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 18:07:21.040716 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 18:07:21.040738 kernel: GPT:9289727 != 1875385007 May 27 18:07:21.040749 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 18:07:21.040758 kernel: GPT:9289727 != 1875385007 May 27 18:07:21.040767 kernel: GPT: Use GNU Parted to correct GPT errors. 
May 27 18:07:21.040776 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 18:07:21.040785 kernel: xhci_hcd 0004:03:00.0: Adding to iommu group 33 May 27 18:07:21.040906 kernel: xhci_hcd 0004:03:00.0: xHCI Host Controller May 27 18:07:21.040986 kernel: xhci_hcd 0004:03:00.0: new USB bus registered, assigned bus number 1 May 27 18:07:21.041062 kernel: xhci_hcd 0004:03:00.0: Zeroing 64bit base registers, expecting fault May 27 18:07:21.041139 kernel: igb: Intel(R) Gigabit Ethernet Network Driver May 27 18:07:21.041149 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. May 27 18:07:21.041158 kernel: igb 0003:03:00.0: Adding to iommu group 34 May 27 18:07:20.880585 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 18:07:21.077520 kernel: mlx5_core 0001:01:00.0: Adding to iommu group 35 May 27 18:07:20.972449 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 18:07:21.062693 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 18:07:21.082775 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 18:07:21.102702 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 18:07:21.125413 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 18:07:21.149317 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - SAMSUNG MZ1LB960HAJQ-00007 EFI-SYSTEM. May 27 18:07:21.167765 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - SAMSUNG MZ1LB960HAJQ-00007 ROOT. May 27 18:07:21.180899 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - SAMSUNG MZ1LB960HAJQ-00007 OEM. May 27 18:07:21.196836 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - SAMSUNG MZ1LB960HAJQ-00007 USR-A. May 27 18:07:21.208269 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - SAMSUNG MZ1LB960HAJQ-00007 USR-A. 
May 27 18:07:21.371876 kernel: xhci_hcd 0004:03:00.0: hcc params 0x014051cf hci version 0x100 quirks 0x0000000100000010 May 27 18:07:21.372024 kernel: xhci_hcd 0004:03:00.0: xHCI Host Controller May 27 18:07:21.372104 kernel: xhci_hcd 0004:03:00.0: new USB bus registered, assigned bus number 2 May 27 18:07:21.372180 kernel: xhci_hcd 0004:03:00.0: Host supports USB 3.0 SuperSpeed May 27 18:07:21.372257 kernel: hub 1-0:1.0: USB hub found May 27 18:07:21.372350 kernel: hub 1-0:1.0: 4 ports detected May 27 18:07:21.372425 kernel: mlx5_core 0001:01:00.0: PTM is not supported by PCIe May 27 18:07:21.372508 kernel: mlx5_core 0001:01:00.0: firmware version: 14.31.1014 May 27 18:07:21.372591 kernel: mlx5_core 0001:01:00.0: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 18:07:21.372668 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. May 27 18:07:21.289182 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 18:07:21.377460 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 18:07:21.394254 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 18:07:21.417169 kernel: igb 0003:03:00.0: added PHC on eth0 May 27 18:07:21.417291 kernel: igb 0003:03:00.0: Intel(R) Gigabit Ethernet Network Connection May 27 18:07:21.406534 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 18:07:21.532515 kernel: igb 0003:03:00.0: eth0: (PCIe:5.0Gb/s:Width x2) 18:c0:4d:80:53:56 May 27 18:07:21.532629 kernel: hub 2-0:1.0: USB hub found May 27 18:07:21.532736 kernel: hub 2-0:1.0: 4 ports detected May 27 18:07:21.532818 kernel: igb 0003:03:00.0: eth0: PBA No: 106300-000 May 27 18:07:21.532894 kernel: igb 0003:03:00.0: Using MSI-X interrupts. 8 rx queue(s), 8 tx queue(s)
May 27 18:07:21.532968 kernel: igb 0003:03:00.1: Adding to iommu group 36 May 27 18:07:21.533050 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 18:07:21.533060 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 18:07:21.522925 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 18:07:21.542479 disk-uuid[1326]: Primary Header is updated. May 27 18:07:21.542479 disk-uuid[1326]: Secondary Entries is updated. May 27 18:07:21.542479 disk-uuid[1326]: Secondary Header is updated. May 27 18:07:21.629177 kernel: igb 0003:03:00.1: added PHC on eth1 May 27 18:07:21.629305 kernel: igb 0003:03:00.1: Intel(R) Gigabit Ethernet Network Connection May 27 18:07:21.629389 kernel: igb 0003:03:00.1: eth1: (PCIe:5.0Gb/s:Width x2) 18:c0:4d:80:53:57 May 27 18:07:21.629464 kernel: igb 0003:03:00.1: eth1: PBA No: 106300-000 May 27 18:07:21.629538 kernel: igb 0003:03:00.1: Using MSI-X interrupts. 8 rx queue(s), 8 tx queue(s) May 27 18:07:21.629617 kernel: igb 0003:03:00.0 eno1: renamed from eth0 May 27 18:07:21.629694 kernel: igb 0003:03:00.1 eno2: renamed from eth1 May 27 18:07:21.567614 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 27 18:07:21.667592 kernel: mlx5_core 0001:01:00.0: E-Switch: Total vports 2, per vport: max uc(128) max mc(2048) May 27 18:07:21.683179 kernel: mlx5_core 0001:01:00.0: Port module event: module 0, Cable plugged May 27 18:07:21.700588 kernel: usb 1-3: new high-speed USB device number 2 using xhci_hcd May 27 18:07:21.842590 kernel: hub 1-3:1.0: USB hub found May 27 18:07:21.851584 kernel: hub 1-3:1.0: 4 ports detected May 27 18:07:21.952594 kernel: usb 2-3: new SuperSpeed USB device number 2 using xhci_hcd May 27 18:07:21.980591 kernel: hub 2-3:1.0: USB hub found May 27 18:07:21.989584 kernel: hub 2-3:1.0: 4 ports detected May 27 18:07:21.989682 kernel: mlx5_core 0001:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) May 27 18:07:22.013584 kernel: mlx5_core 0001:01:00.1: Adding to iommu group 37 May 27 18:07:22.028816 kernel: mlx5_core 0001:01:00.1: PTM is not supported by PCIe May 27 18:07:22.028981 kernel: mlx5_core 0001:01:00.1: firmware version: 14.31.1014 May 27 18:07:22.042682 kernel: mlx5_core 0001:01:00.1: 31.504 Gb/s available PCIe bandwidth, limited by 8.0 GT/s PCIe x4 link at 0001:00:01.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) May 27 18:07:22.386587 kernel: mlx5_core 0001:01:00.1: E-Switch: Total vports 2, per vport: max uc(128) max mc(2048) May 27 18:07:22.403307 kernel: mlx5_core 0001:01:00.1: Port module event: module 1, Cable plugged May 27 18:07:22.511532 disk-uuid[1327]: The operation has completed successfully. May 27 18:07:22.516209 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 18:07:22.740594 kernel: mlx5_core 0001:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) May 27 18:07:22.754595 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: renamed from eth1 May 27 18:07:22.754767 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: renamed from eth0 May 27 18:07:22.800454 systemd[1]: disk-uuid.service: Deactivated successfully. 
May 27 18:07:22.800548 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 18:07:22.806050 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 18:07:22.828235 sh[1539]: Success May 27 18:07:22.865687 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 18:07:22.865725 kernel: device-mapper: uevent: version 1.0.3 May 27 18:07:22.874966 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 18:07:22.902589 kernel: device-mapper: verity: sha256 using shash "sha256-ce" May 27 18:07:22.932375 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 18:07:22.943437 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 18:07:22.969601 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 18:07:22.977586 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 18:07:22.977603 kernel: BTRFS: device fsid 3c8c76ef-f1da-40fe-979d-11bdf765e403 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (1554) May 27 18:07:22.978584 kernel: BTRFS info (device dm-0): first mount of filesystem 3c8c76ef-f1da-40fe-979d-11bdf765e403 May 27 18:07:22.978595 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 27 18:07:22.978604 kernel: BTRFS info (device dm-0): using free-space-tree May 27 18:07:23.064183 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 18:07:23.070348 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 18:07:23.080526 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 18:07:23.081530 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
May 27 18:07:23.106116 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 18:07:23.216561 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 (259:6) scanned by mount (1578) May 27 18:07:23.216577 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 18:07:23.216599 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 18:07:23.216609 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 18:07:23.216618 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 18:07:23.217894 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 18:07:23.230042 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 18:07:23.248502 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 18:07:23.271949 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 18:07:23.309208 systemd-networkd[1733]: lo: Link UP May 27 18:07:23.309214 systemd-networkd[1733]: lo: Gained carrier May 27 18:07:23.312816 systemd-networkd[1733]: Enumeration completed May 27 18:07:23.312932 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 18:07:23.313887 systemd-networkd[1733]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 18:07:23.327485 systemd[1]: Reached target network.target - Network. May 27 18:07:23.365125 systemd-networkd[1733]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 27 18:07:23.392101 ignition[1729]: Ignition 2.21.0 May 27 18:07:23.392108 ignition[1729]: Stage: fetch-offline May 27 18:07:23.392135 ignition[1729]: no configs at "/usr/lib/ignition/base.d" May 27 18:07:23.399545 unknown[1729]: fetched base config from "system" May 27 18:07:23.392142 ignition[1729]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 18:07:23.399552 unknown[1729]: fetched user config from "system" May 27 18:07:23.392312 ignition[1729]: parsed url from cmdline: "" May 27 18:07:23.402384 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 18:07:23.392314 ignition[1729]: no config URL provided May 27 18:07:23.409334 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 27 18:07:23.392318 ignition[1729]: reading system config file "/usr/lib/ignition/user.ign" May 27 18:07:23.410469 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 27 18:07:23.392369 ignition[1729]: parsing config with SHA512: 28de1a93fb6122acadd246328548c22761587e1e97cf357f8417e5c267856d0f7892edc853f1ddf2dbf3f7b6e1578a24c750dafd6c7fd00ad866b1e8a32fcb7e May 27 18:07:23.418709 systemd-networkd[1733]: enP1p1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 27 18:07:23.399869 ignition[1729]: fetch-offline: fetch-offline passed May 27 18:07:23.399874 ignition[1729]: POST message to Packet Timeline May 27 18:07:23.399879 ignition[1729]: POST Status error: resource requires networking May 27 18:07:23.399925 ignition[1729]: Ignition finished successfully May 27 18:07:23.455942 ignition[1776]: Ignition 2.21.0 May 27 18:07:23.455948 ignition[1776]: Stage: kargs May 27 18:07:23.456173 ignition[1776]: no configs at "/usr/lib/ignition/base.d" May 27 18:07:23.456181 ignition[1776]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 18:07:23.461117 ignition[1776]: kargs: kargs passed May 27 18:07:23.461123 ignition[1776]: POST message to Packet Timeline May 27 18:07:23.461335 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #1 May 27 18:07:23.465018 ignition[1776]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45338->[::1]:53: read: connection refused May 27 18:07:23.666022 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #2 May 27 18:07:23.666630 ignition[1776]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:41219->[::1]:53: read: connection refused May 27 18:07:24.042597 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: Link up May 27 18:07:24.045110 systemd-networkd[1733]: enP1p1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 27 18:07:24.067280 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #3 May 27 18:07:24.067742 ignition[1776]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54689->[::1]:53: read: connection refused May 27 18:07:24.668596 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: Link up May 27 18:07:24.671640 systemd-networkd[1733]: eno1: Link UP May 27 18:07:24.671773 systemd-networkd[1733]: eno2: Link UP May 27 18:07:24.671889 systemd-networkd[1733]: enP1p1s0f0np0: Link UP May 27 18:07:24.672022 systemd-networkd[1733]: enP1p1s0f0np0: Gained carrier May 27 18:07:24.681815 systemd-networkd[1733]: enP1p1s0f1np1: Link UP May 27 18:07:24.683101 systemd-networkd[1733]: enP1p1s0f1np1: Gained carrier May 27 18:07:24.706623 systemd-networkd[1733]: enP1p1s0f0np0: DHCPv4 address 147.28.228.207/31, gateway 147.28.228.206 acquired from 147.28.144.140 May 27 18:07:24.867978 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #4 May 27 18:07:24.868889 ignition[1776]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54098->[::1]:53: read: connection refused May 27 18:07:25.681808 systemd-networkd[1733]: enP1p1s0f0np0: Gained IPv6LL May 27 18:07:26.470529 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #5 May 27 18:07:26.471525 ignition[1776]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:57428->[::1]:53: read: connection refused May 27 18:07:26.577813 systemd-networkd[1733]: enP1p1s0f1np1: Gained IPv6LL May 27 18:07:29.674219 ignition[1776]: GET https://metadata.packet.net/metadata: attempt #6 May 27 18:07:30.181905 ignition[1776]: GET result: OK May 27 18:07:30.485551 ignition[1776]: Ignition finished successfully May 27 18:07:30.490649 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
May 27 18:07:30.493736 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 18:07:30.533775 ignition[1814]: Ignition 2.21.0 May 27 18:07:30.533783 ignition[1814]: Stage: disks May 27 18:07:30.533922 ignition[1814]: no configs at "/usr/lib/ignition/base.d" May 27 18:07:30.533931 ignition[1814]: no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 18:07:30.536352 ignition[1814]: disks: disks passed May 27 18:07:30.536358 ignition[1814]: POST message to Packet Timeline May 27 18:07:30.536378 ignition[1814]: GET https://metadata.packet.net/metadata: attempt #1 May 27 18:07:31.549436 ignition[1814]: GET result: OK May 27 18:07:31.870089 ignition[1814]: Ignition finished successfully May 27 18:07:31.874631 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 18:07:31.879054 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 18:07:31.886640 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 18:07:31.894698 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 18:07:31.903214 systemd[1]: Reached target sysinit.target - System Initialization. May 27 18:07:31.912107 systemd[1]: Reached target basic.target - Basic System. May 27 18:07:31.922514 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 18:07:31.956356 systemd-fsck[1836]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 18:07:31.959586 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 18:07:31.967797 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 18:07:32.070588 kernel: EXT4-fs (nvme0n1p9): mounted filesystem a5483afc-8426-4c3e-85ef-8146f9077e7d r/w with ordered data mode. Quota mode: none. May 27 18:07:32.070725 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 18:07:32.081180 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. 
May 27 18:07:32.092213 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 18:07:32.116207 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 18:07:32.124584 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 (259:6) scanned by mount (1847) May 27 18:07:32.124606 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 18:07:32.124616 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 18:07:32.124626 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 18:07:32.192996 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 18:07:32.217878 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... May 27 18:07:32.229876 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 18:07:32.229906 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 18:07:32.250194 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 27 18:07:32.258616 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 18:07:32.272369 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
May 27 18:07:32.286928 coreos-metadata[1864]: May 27 18:07:32.267 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 18:07:32.298109 coreos-metadata[1865]: May 27 18:07:32.267 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 May 27 18:07:32.325351 initrd-setup-root[1888]: cut: /sysroot/etc/passwd: No such file or directory May 27 18:07:32.331854 initrd-setup-root[1895]: cut: /sysroot/etc/group: No such file or directory May 27 18:07:32.338292 initrd-setup-root[1903]: cut: /sysroot/etc/shadow: No such file or directory May 27 18:07:32.344787 initrd-setup-root[1910]: cut: /sysroot/etc/gshadow: No such file or directory May 27 18:07:32.413915 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 18:07:32.421594 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 18:07:32.448872 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 18:07:32.481825 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 18:07:32.459003 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 18:07:32.492455 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 18:07:32.500661 ignition[1980]: INFO : Ignition 2.21.0 May 27 18:07:32.500661 ignition[1980]: INFO : Stage: mount May 27 18:07:32.515276 ignition[1980]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 18:07:32.515276 ignition[1980]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 18:07:32.515276 ignition[1980]: INFO : mount: mount passed May 27 18:07:32.515276 ignition[1980]: INFO : POST message to Packet Timeline May 27 18:07:32.515276 ignition[1980]: INFO : GET https://metadata.packet.net/metadata: attempt #1 May 27 18:07:32.755185 coreos-metadata[1865]: May 27 18:07:32.755 INFO Fetch successful May 27 18:07:32.802509 systemd[1]: flatcar-static-network.service: Deactivated successfully. 
May 27 18:07:32.803698 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. May 27 18:07:32.842107 coreos-metadata[1864]: May 27 18:07:32.842 INFO Fetch successful May 27 18:07:32.883794 coreos-metadata[1864]: May 27 18:07:32.883 INFO wrote hostname ci-4344.0.0-a-e5d745d36c to /sysroot/etc/hostname May 27 18:07:32.888667 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 18:07:32.966102 ignition[1980]: INFO : GET result: OK May 27 18:07:33.467276 ignition[1980]: INFO : Ignition finished successfully May 27 18:07:33.471718 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 18:07:33.478503 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 18:07:33.506689 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 18:07:33.548191 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/nvme0n1p6 (259:6) scanned by mount (2010) May 27 18:07:33.548229 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 0631e8fb-ef71-4ba1-b2b8-88386996a754 May 27 18:07:33.562686 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm May 27 18:07:33.575902 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 18:07:33.584788 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 18:07:33.629281 ignition[2027]: INFO : Ignition 2.21.0 May 27 18:07:33.629281 ignition[2027]: INFO : Stage: files May 27 18:07:33.639218 ignition[2027]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 18:07:33.639218 ignition[2027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" May 27 18:07:33.639218 ignition[2027]: DEBUG : files: compiled without relabeling support, skipping May 27 18:07:33.639218 ignition[2027]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 18:07:33.639218 ignition[2027]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 18:07:33.639218 ignition[2027]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 18:07:33.639218 ignition[2027]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 18:07:33.639218 ignition[2027]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 18:07:33.639218 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 27 18:07:33.639218 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 May 27 18:07:33.634298 unknown[2027]: wrote ssh authorized keys file for user: core May 27 18:07:33.736142 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 18:07:33.784263 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" May 27 18:07:33.795248 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
May 27 18:07:34.144705 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 18:07:34.529997 ignition[2027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 18:07:34.554723 ignition[2027]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 18:07:34.554723 ignition[2027]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 18:07:34.554723 ignition[2027]: INFO : files: files passed May 27 18:07:34.554723 ignition[2027]: INFO : POST message to Packet Timeline May 27 18:07:34.554723 ignition[2027]: INFO : GET https://metadata.packet.net/metadata: attempt #1 May 27 18:07:34.985439 ignition[2027]: INFO : GET result: OK May 27 18:07:35.337411 ignition[2027]: INFO : Ignition finished successfully May 27 18:07:35.342647 systemd[1]: Finished ignition-files.service - Ignition (files).
May 27 18:07:35.350774 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 27 18:07:35.380224 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 27 18:07:35.398961 systemd[1]: ignition-quench.service: Deactivated successfully.
May 27 18:07:35.399149 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 27 18:07:35.416973 initrd-setup-root-after-ignition[2071]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:07:35.416973 initrd-setup-root-after-ignition[2071]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:07:35.411452 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 18:07:35.467971 initrd-setup-root-after-ignition[2075]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:07:35.424404 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 27 18:07:35.440928 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 27 18:07:35.499517 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 27 18:07:35.501663 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 27 18:07:35.511307 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 27 18:07:35.527394 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 27 18:07:35.538531 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 27 18:07:35.539417 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 27 18:07:35.572341 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 18:07:35.585010 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 27 18:07:35.616622 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 27 18:07:35.628424 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 18:07:35.634301 systemd[1]: Stopped target timers.target - Timer Units.
May 27 18:07:35.645855 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 27 18:07:35.645958 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 18:07:35.657518 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 27 18:07:35.668757 systemd[1]: Stopped target basic.target - Basic System.
May 27 18:07:35.680199 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 27 18:07:35.691639 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 18:07:35.702882 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 27 18:07:35.714113 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 27 18:07:35.725437 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 27 18:07:35.736693 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 18:07:35.747963 systemd[1]: Stopped target sysinit.target - System Initialization.
May 27 18:07:35.759330 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 27 18:07:35.776181 systemd[1]: Stopped target swap.target - Swaps.
May 27 18:07:35.787494 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 27 18:07:35.787597 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 27 18:07:35.798916 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 27 18:07:35.810041 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:07:35.821082 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 27 18:07:35.824610 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:07:35.832276 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 27 18:07:35.832369 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 27 18:07:35.843792 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 27 18:07:35.843880 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 18:07:35.855094 systemd[1]: Stopped target paths.target - Path Units.
May 27 18:07:35.866428 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 27 18:07:35.871609 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:07:35.883552 systemd[1]: Stopped target slices.target - Slice Units.
May 27 18:07:35.895082 systemd[1]: Stopped target sockets.target - Socket Units.
May 27 18:07:35.906701 systemd[1]: iscsid.socket: Deactivated successfully.
May 27 18:07:35.906783 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 27 18:07:35.918352 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 27 18:07:35.918442 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 18:07:36.024536 ignition[2097]: INFO : Ignition 2.21.0
May 27 18:07:36.024536 ignition[2097]: INFO : Stage: umount
May 27 18:07:36.024536 ignition[2097]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 18:07:36.024536 ignition[2097]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
May 27 18:07:36.024536 ignition[2097]: INFO : umount: umount passed
May 27 18:07:36.024536 ignition[2097]: INFO : POST message to Packet Timeline
May 27 18:07:36.024536 ignition[2097]: INFO : GET https://metadata.packet.net/metadata: attempt #1
May 27 18:07:35.930075 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 27 18:07:35.930161 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 18:07:35.941759 systemd[1]: ignition-files.service: Deactivated successfully.
May 27 18:07:35.941841 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 27 18:07:35.953426 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 27 18:07:35.953507 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 27 18:07:35.971619 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 27 18:07:35.997136 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 27 18:07:36.006145 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 27 18:07:36.006265 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 18:07:36.018715 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 27 18:07:36.018796 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 18:07:36.033267 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 27 18:07:36.034119 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 27 18:07:36.035609 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 27 18:07:36.044110 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 27 18:07:36.045601 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 27 18:07:36.492892 ignition[2097]: INFO : GET result: OK
May 27 18:07:36.747012 ignition[2097]: INFO : Ignition finished successfully
May 27 18:07:36.750721 systemd[1]: ignition-mount.service: Deactivated successfully.
May 27 18:07:36.751702 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 27 18:07:36.757105 systemd[1]: Stopped target network.target - Network.
May 27 18:07:36.766137 systemd[1]: ignition-disks.service: Deactivated successfully.
May 27 18:07:36.766212 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 27 18:07:36.775507 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 27 18:07:36.775549 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 27 18:07:36.785045 systemd[1]: ignition-setup.service: Deactivated successfully.
May 27 18:07:36.785113 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 27 18:07:36.794572 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 27 18:07:36.794603 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 27 18:07:36.804191 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 27 18:07:36.804239 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 27 18:07:36.813913 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 27 18:07:36.823538 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 27 18:07:36.833683 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 27 18:07:36.834674 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 27 18:07:36.847327 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 27 18:07:36.848242 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 27 18:07:36.848348 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 18:07:36.860964 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 27 18:07:36.863734 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 27 18:07:36.863908 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 27 18:07:36.874440 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 27 18:07:36.874809 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 27 18:07:36.883168 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 27 18:07:36.883210 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:07:36.894018 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 27 18:07:36.903893 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 27 18:07:36.903946 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 18:07:36.914401 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 27 18:07:36.914439 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 27 18:07:36.930426 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 27 18:07:36.930479 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 27 18:07:36.941113 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 18:07:36.958266 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 27 18:07:36.958652 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 27 18:07:36.958778 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 18:07:36.976333 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 27 18:07:36.976468 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 27 18:07:36.985500 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 27 18:07:36.985570 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:07:36.996568 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 27 18:07:36.996628 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 27 18:07:37.013644 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 27 18:07:37.013681 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 27 18:07:37.024918 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 27 18:07:37.024983 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 18:07:37.037296 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 27 18:07:37.048003 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 27 18:07:37.048078 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 27 18:07:37.059847 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 27 18:07:37.059885 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 18:07:37.071805 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 27 18:07:37.071843 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 18:07:37.089442 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 27 18:07:37.089477 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:07:37.101266 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 18:07:37.101298 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:07:37.115627 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 27 18:07:37.115684 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
May 27 18:07:37.115711 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 27 18:07:37.115745 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 18:07:37.116048 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 27 18:07:37.116121 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 27 18:07:37.638740 systemd[1]: network-cleanup.service: Deactivated successfully.
May 27 18:07:37.639680 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 27 18:07:37.650290 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 27 18:07:37.661375 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 27 18:07:37.690984 systemd[1]: Switching root.
May 27 18:07:37.758479 systemd-journald[911]: Journal stopped
May 27 18:07:39.936316 systemd-journald[911]: Received SIGTERM from PID 1 (systemd).
May 27 18:07:39.936343 kernel: SELinux: policy capability network_peer_controls=1
May 27 18:07:39.936353 kernel: SELinux: policy capability open_perms=1
May 27 18:07:39.936361 kernel: SELinux: policy capability extended_socket_class=1
May 27 18:07:39.936368 kernel: SELinux: policy capability always_check_network=0
May 27 18:07:39.936376 kernel: SELinux: policy capability cgroup_seclabel=1
May 27 18:07:39.936384 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 27 18:07:39.936393 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 27 18:07:39.936400 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 27 18:07:39.936407 kernel: SELinux: policy capability userspace_initial_context=0
May 27 18:07:39.936415 kernel: audit: type=1403 audit(1748369257.961:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 27 18:07:39.936424 systemd[1]: Successfully loaded SELinux policy in 141.108ms.
May 27 18:07:39.936433 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.601ms.
May 27 18:07:39.936442 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 18:07:39.936453 systemd[1]: Detected architecture arm64.
May 27 18:07:39.936461 systemd[1]: Detected first boot.
May 27 18:07:39.936470 systemd[1]: Hostname set to .
May 27 18:07:39.936479 systemd[1]: Initializing machine ID from random generator.
May 27 18:07:39.936487 zram_generator::config[2167]: No configuration found.
May 27 18:07:39.936497 systemd[1]: Populated /etc with preset unit settings.
May 27 18:07:39.936506 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 18:07:39.936515 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 18:07:39.936523 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 18:07:39.936531 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 18:07:39.936540 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 18:07:39.936548 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 18:07:39.936558 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 18:07:39.936567 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 18:07:39.936576 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 18:07:39.936587 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 18:07:39.936596 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 18:07:39.936605 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 18:07:39.936614 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:07:39.936623 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:07:39.936633 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 18:07:39.936642 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 18:07:39.936650 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 18:07:39.936659 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 18:07:39.936667 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
May 27 18:07:39.936676 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:07:39.936687 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 18:07:39.936695 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 18:07:39.936705 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 18:07:39.936714 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 18:07:39.936723 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 18:07:39.936732 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 18:07:39.936741 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 18:07:39.936749 systemd[1]: Reached target slices.target - Slice Units.
May 27 18:07:39.936758 systemd[1]: Reached target swap.target - Swaps.
May 27 18:07:39.936770 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 18:07:39.936779 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 18:07:39.936788 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 18:07:39.936797 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:07:39.936806 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 18:07:39.936816 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:07:39.936825 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 18:07:39.936834 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 18:07:39.936843 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 18:07:39.936852 systemd[1]: Mounting media.mount - External Media Directory...
May 27 18:07:39.936861 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 18:07:39.936869 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 18:07:39.936878 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 18:07:39.936889 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 18:07:39.936898 systemd[1]: Reached target machines.target - Containers.
May 27 18:07:39.936907 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 18:07:39.936916 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:07:39.936925 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 18:07:39.936933 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 18:07:39.936942 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 18:07:39.936951 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 18:07:39.936960 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 18:07:39.936969 kernel: ACPI: bus type drm_connector registered
May 27 18:07:39.936978 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 18:07:39.936987 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 18:07:39.936995 kernel: fuse: init (API version 7.41)
May 27 18:07:39.937003 kernel: loop: module loaded
May 27 18:07:39.937012 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 18:07:39.937020 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 18:07:39.937029 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 18:07:39.937039 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 18:07:39.937048 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 18:07:39.937057 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:07:39.937066 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 18:07:39.937075 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 18:07:39.937084 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 18:07:39.937093 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 18:07:39.937120 systemd-journald[2270]: Collecting audit messages is disabled.
May 27 18:07:39.937141 systemd-journald[2270]: Journal started
May 27 18:07:39.937160 systemd-journald[2270]: Runtime Journal (/run/log/journal/c86a8443ab5e47cb89535d93907de6d0) is 8M, max 4G, 3.9G free.
May 27 18:07:38.504949 systemd[1]: Queued start job for default target multi-user.target.
May 27 18:07:38.534146 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
May 27 18:07:38.534477 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 18:07:38.534761 systemd[1]: systemd-journald.service: Consumed 3.505s CPU time.
May 27 18:07:39.972596 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 18:07:39.993596 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 18:07:40.016829 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 18:07:40.016847 systemd[1]: Stopped verity-setup.service.
May 27 18:07:40.042591 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 18:07:40.047973 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 18:07:40.053574 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 18:07:40.059298 systemd[1]: Mounted media.mount - External Media Directory.
May 27 18:07:40.064744 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 18:07:40.070228 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 18:07:40.075666 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 18:07:40.081170 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 18:07:40.086693 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:07:40.093352 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 18:07:40.093524 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 18:07:40.098902 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 18:07:40.099052 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 18:07:40.106431 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 18:07:40.106641 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 18:07:40.112013 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 18:07:40.113617 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 18:07:40.119157 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 27 18:07:40.119320 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 27 18:07:40.124555 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 18:07:40.124717 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 18:07:40.130625 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 18:07:40.135752 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 18:07:40.140876 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 27 18:07:40.147118 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 27 18:07:40.162100 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 18:07:40.168225 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 27 18:07:40.190370 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 27 18:07:40.195351 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 27 18:07:40.195381 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 18:07:40.201023 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 27 18:07:40.206859 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 27 18:07:40.211797 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:07:40.234196 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 18:07:40.239842 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 18:07:40.244711 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 18:07:40.245689 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 18:07:40.250471 systemd-journald[2270]: Time spent on flushing to /var/log/journal/c86a8443ab5e47cb89535d93907de6d0 is 25.105ms for 2478 entries.
May 27 18:07:40.250471 systemd-journald[2270]: System Journal (/var/log/journal/c86a8443ab5e47cb89535d93907de6d0) is 8M, max 195.6M, 187.6M free.
May 27 18:07:40.284014 systemd-journald[2270]: Received client request to flush runtime journal.
May 27 18:07:40.284162 kernel: loop0: detected capacity change from 0 to 203944
May 27 18:07:40.250495 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 18:07:40.251656 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 18:07:40.268752 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 18:07:40.274560 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 18:07:40.280811 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 18:07:40.297550 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 18:07:40.307586 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 18:07:40.309356 systemd-tmpfiles[2306]: ACLs are not supported, ignoring.
May 27 18:07:40.309369 systemd-tmpfiles[2306]: ACLs are not supported, ignoring.
May 27 18:07:40.311683 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 18:07:40.316415 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 18:07:40.321361 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 18:07:40.326161 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 18:07:40.330880 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 18:07:40.339455 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 18:07:40.345542 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 18:07:40.365596 kernel: loop1: detected capacity change from 0 to 107312
May 27 18:07:40.370638 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 18:07:40.378100 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 27 18:07:40.378680 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 18:07:40.402173 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 18:07:40.408463 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 18:07:40.411586 kernel: loop2: detected capacity change from 0 to 138376
May 27 18:07:40.454815 systemd-tmpfiles[2331]: ACLs are not supported, ignoring.
May 27 18:07:40.454828 systemd-tmpfiles[2331]: ACLs are not supported, ignoring.
May 27 18:07:40.458261 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 18:07:40.482916 kernel: loop3: detected capacity change from 0 to 8
May 27 18:07:40.503637 ldconfig[2299]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 27 18:07:40.505136 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 27 18:07:40.538599 kernel: loop4: detected capacity change from 0 to 203944
May 27 18:07:40.557626 kernel: loop5: detected capacity change from 0 to 107312
May 27 18:07:40.574592 kernel: loop6: detected capacity change from 0 to 138376
May 27 18:07:40.592649 kernel: loop7: detected capacity change from 0 to 8
May 27 18:07:40.593080 (sd-merge)[2346]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'.
May 27 18:07:40.593510 (sd-merge)[2346]: Merged extensions into '/usr'.
May 27 18:07:40.601438 systemd[1]: Reload requested from client PID 2305 ('systemd-sysext') (unit systemd-sysext.service)...
May 27 18:07:40.601452 systemd[1]: Reloading...
May 27 18:07:40.644587 zram_generator::config[2371]: No configuration found.
May 27 18:07:40.721647 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:07:40.795601 systemd[1]: Reloading finished in 193 ms.
May 27 18:07:40.824908 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 27 18:07:40.830200 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 27 18:07:40.857882 systemd[1]: Starting ensure-sysext.service...
May 27 18:07:40.863833 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 18:07:40.870839 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 18:07:40.882267 systemd[1]: Reload requested from client PID 2424 ('systemctl') (unit ensure-sysext.service)...
May 27 18:07:40.882278 systemd[1]: Reloading...
May 27 18:07:40.882549 systemd-tmpfiles[2425]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 27 18:07:40.882572 systemd-tmpfiles[2425]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 27 18:07:40.882782 systemd-tmpfiles[2425]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 27 18:07:40.882957 systemd-tmpfiles[2425]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 27 18:07:40.883515 systemd-tmpfiles[2425]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 27 18:07:40.883712 systemd-tmpfiles[2425]: ACLs are not supported, ignoring.
May 27 18:07:40.883753 systemd-tmpfiles[2425]: ACLs are not supported, ignoring.
May 27 18:07:40.886738 systemd-tmpfiles[2425]: Detected autofs mount point /boot during canonicalization of boot.
May 27 18:07:40.886745 systemd-tmpfiles[2425]: Skipping /boot
May 27 18:07:40.895299 systemd-tmpfiles[2425]: Detected autofs mount point /boot during canonicalization of boot.
May 27 18:07:40.895307 systemd-tmpfiles[2425]: Skipping /boot
May 27 18:07:40.899465 systemd-udevd[2426]: Using default interface naming scheme 'v255'.
May 27 18:07:40.929587 zram_generator::config[2459]: No configuration found.
May 27 18:07:40.974594 kernel: IPMI message handler: version 39.2
May 27 18:07:40.984587 kernel: ipmi device interface
May 27 18:07:40.996594 kernel: ipmi_ssif: IPMI SSIF Interface driver
May 27 18:07:41.002599 kernel: MACsec IEEE 802.1AE
May 27 18:07:41.002654 kernel: ipmi_si: IPMI System Interface driver
May 27 18:07:41.018687 kernel: ipmi_si: Unable to find any System Interface(s)
May 27 18:07:41.019667 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:07:41.112058 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
May 27 18:07:41.112307 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - SAMSUNG MZ1LB960HAJQ-00007 OEM.
May 27 18:07:41.117140 systemd[1]: Reloading finished in 234 ms.
May 27 18:07:41.137899 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 18:07:41.170360 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 18:07:41.193186 systemd[1]: Finished ensure-sysext.service.
May 27 18:07:41.214987 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 18:07:41.241383 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 27 18:07:41.246418 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:07:41.247329 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 18:07:41.253253 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 18:07:41.259148 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 18:07:41.264979 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 18:07:41.269924 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:07:41.270822 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 27 18:07:41.275613 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:07:41.276735 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 27 18:07:41.281723 augenrules[2680]: No rules
May 27 18:07:41.283433 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 18:07:41.290085 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 18:07:41.296319 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 27 18:07:41.301903 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 27 18:07:41.307350 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 18:07:41.312610 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 18:07:41.312814 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 18:07:41.317474 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 27 18:07:41.323185 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 18:07:41.323969 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 18:07:41.328554 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 18:07:41.328743 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 18:07:41.333212 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 18:07:41.333368 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 18:07:41.337969 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 18:07:41.338127 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 18:07:41.342754 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 27 18:07:41.348302 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 27 18:07:41.353159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:07:41.365549 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 18:07:41.365678 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 18:07:41.366990 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 27 18:07:41.393996 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 27 18:07:41.398568 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 18:07:41.398989 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 27 18:07:41.404376 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 27 18:07:41.430697 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 27 18:07:41.492545 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 27 18:07:41.497360 systemd[1]: Reached target time-set.target - System Time Set.
May 27 18:07:41.500662 systemd-resolved[2687]: Positive Trust Anchors:
May 27 18:07:41.500675 systemd-resolved[2687]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 18:07:41.500706 systemd-resolved[2687]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 18:07:41.504098 systemd-resolved[2687]: Using system hostname 'ci-4344.0.0-a-e5d745d36c'.
May 27 18:07:41.505434 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 18:07:41.508190 systemd-networkd[2686]: lo: Link UP
May 27 18:07:41.508196 systemd-networkd[2686]: lo: Gained carrier
May 27 18:07:41.509841 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 18:07:41.511788 systemd-networkd[2686]: bond0: netdev ready
May 27 18:07:41.514226 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 18:07:41.518623 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 27 18:07:41.520931 systemd-networkd[2686]: Enumeration completed
May 27 18:07:41.523007 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 27 18:07:41.527521 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 27 18:07:41.531943 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 27 18:07:41.533510 systemd-networkd[2686]: enP1p1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:49:d0:ec.network.
May 27 18:07:41.536256 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 27 18:07:41.540614 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 27 18:07:41.540636 systemd[1]: Reached target paths.target - Path Units.
May 27 18:07:41.544877 systemd[1]: Reached target timers.target - Timer Units.
May 27 18:07:41.549761 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 27 18:07:41.555487 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 27 18:07:41.561630 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 27 18:07:41.579645 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 27 18:07:41.584417 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 27 18:07:41.589267 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 18:07:41.593812 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 27 18:07:41.598336 systemd[1]: Reached target network.target - Network.
May 27 18:07:41.602779 systemd[1]: Reached target sockets.target - Socket Units.
May 27 18:07:41.607125 systemd[1]: Reached target basic.target - Basic System.
May 27 18:07:41.611428 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 27 18:07:41.611448 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 27 18:07:41.612508 systemd[1]: Starting containerd.service - containerd container runtime...
May 27 18:07:41.639244 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 27 18:07:41.644722 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 27 18:07:41.650233 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 27 18:07:41.655680 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 27 18:07:41.660695 coreos-metadata[2729]: May 27 18:07:41.660 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
May 27 18:07:41.661211 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 27 18:07:41.663893 coreos-metadata[2729]: May 27 18:07:41.663 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata)
May 27 18:07:41.665677 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 27 18:07:41.665804 jq[2734]: false
May 27 18:07:41.666717 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 27 18:07:41.672305 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 27 18:07:41.677905 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 27 18:07:41.680831 extend-filesystems[2736]: Found loop4
May 27 18:07:41.687195 extend-filesystems[2736]: Found loop5
May 27 18:07:41.687195 extend-filesystems[2736]: Found loop6
May 27 18:07:41.687195 extend-filesystems[2736]: Found loop7
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme1n1
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p1
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p2
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p3
May 27 18:07:41.687195 extend-filesystems[2736]: Found usr
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p4
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p6
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p7
May 27 18:07:41.687195 extend-filesystems[2736]: Found nvme0n1p9
May 27 18:07:41.687195 extend-filesystems[2736]: Checking size of /dev/nvme0n1p9
May 27 18:07:41.803162 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 233815889 blocks
May 27 18:07:41.683745 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 27 18:07:41.803290 extend-filesystems[2736]: Resized partition /dev/nvme0n1p9
May 27 18:07:41.695821 systemd[1]: Starting systemd-logind.service - User Login Management...
May 27 18:07:41.807749 extend-filesystems[2756]: resize2fs 1.47.2 (1-Jan-2025)
May 27 18:07:41.702048 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 27 18:07:41.738647 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 27 18:07:41.745406 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 27 18:07:41.745992 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 27 18:07:41.812575 update_engine[2765]: I20250527 18:07:41.804711 2765 main.cc:92] Flatcar Update Engine starting
May 27 18:07:41.746559 systemd[1]: Starting update-engine.service - Update Engine...
May 27 18:07:41.812823 jq[2766]: true
May 27 18:07:41.754005 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 27 18:07:41.763519 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 27 18:07:41.770968 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 27 18:07:41.771156 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 27 18:07:41.771442 systemd[1]: motdgen.service: Deactivated successfully.
May 27 18:07:41.771609 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 27 18:07:41.780375 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 27 18:07:41.780570 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 27 18:07:41.787036 systemd-logind[2754]: Watching system buttons on /dev/input/event0 (Power Button)
May 27 18:07:41.788526 systemd-logind[2754]: New seat seat0.
May 27 18:07:41.819201 systemd[1]: Started systemd-logind.service - User Login Management.
May 27 18:07:41.823347 sshd_keygen[2759]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 27 18:07:41.827162 jq[2771]: true
May 27 18:07:41.828193 (ntainerd)[2775]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 27 18:07:41.835618 tar[2768]: linux-arm64/helm
May 27 18:07:41.843352 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 27 18:07:41.843528 dbus-daemon[2730]: [system] SELinux support is enabled
May 27 18:07:41.847518 update_engine[2765]: I20250527 18:07:41.847481 2765 update_check_scheduler.cc:74] Next update check in 6m38s
May 27 18:07:41.847869 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 27 18:07:41.858074 bash[2807]: Updated "/home/core/.ssh/authorized_keys"
May 27 18:07:41.859350 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 27 18:07:41.865020 dbus-daemon[2730]: [system] Successfully activated service 'org.freedesktop.systemd1'
May 27 18:07:41.866256 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 27 18:07:41.872514 systemd[1]: Starting sshkeys.service...
May 27 18:07:41.877325 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 27 18:07:41.877352 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 27 18:07:41.882496 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 27 18:07:41.882515 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 27 18:07:41.887436 systemd[1]: issuegen.service: Deactivated successfully.
May 27 18:07:41.887626 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 27 18:07:41.896171 systemd[1]: Started update-engine.service - Update Engine.
May 27 18:07:41.903487 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
May 27 18:07:41.909227 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
May 27 18:07:41.915287 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 27 18:07:41.921442 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 27 18:07:41.929714 coreos-metadata[2819]: May 27 18:07:41.929 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
May 27 18:07:41.929744 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 27 18:07:41.931091 coreos-metadata[2819]: May 27 18:07:41.931 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata)
May 27 18:07:41.936095 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 27 18:07:41.942062 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
May 27 18:07:41.946937 systemd[1]: Reached target getty.target - Login Prompts.
May 27 18:07:41.957743 locksmithd[2828]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 27 18:07:41.993306 containerd[2775]: time="2025-05-27T18:07:41Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 27 18:07:41.994418 containerd[2775]: time="2025-05-27T18:07:41.994392240Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 27 18:07:42.001770 containerd[2775]: time="2025-05-27T18:07:42.001747600Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.24µs"
May 27 18:07:42.001801 containerd[2775]: time="2025-05-27T18:07:42.001772640Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 27 18:07:42.001801 containerd[2775]: time="2025-05-27T18:07:42.001793000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 27 18:07:42.001941 containerd[2775]: time="2025-05-27T18:07:42.001929320Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 27 18:07:42.001964 containerd[2775]: time="2025-05-27T18:07:42.001948520Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 27 18:07:42.001984 containerd[2775]: time="2025-05-27T18:07:42.001969360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 18:07:42.002029 containerd[2775]: time="2025-05-27T18:07:42.002016880Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 18:07:42.002047 containerd[2775]: time="2025-05-27T18:07:42.002028520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 18:07:42.002231 containerd[2775]: time="2025-05-27T18:07:42.002215880Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 18:07:42.002255 containerd[2775]: time="2025-05-27T18:07:42.002231720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 18:07:42.002255 containerd[2775]: time="2025-05-27T18:07:42.002242680Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 18:07:42.002255 containerd[2775]: time="2025-05-27T18:07:42.002250040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 27 18:07:42.002334 containerd[2775]: time="2025-05-27T18:07:42.002323400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 27 18:07:42.002513 containerd[2775]: time="2025-05-27T18:07:42.002499520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 18:07:42.002538 containerd[2775]: time="2025-05-27T18:07:42.002528280Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 18:07:42.002560 containerd[2775]: time="2025-05-27T18:07:42.002538400Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 27 18:07:42.002577 containerd[2775]: time="2025-05-27T18:07:42.002564000Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 27 18:07:42.003369 containerd[2775]: time="2025-05-27T18:07:42.003345920Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 27 18:07:42.003472 containerd[2775]: time="2025-05-27T18:07:42.003460200Z" level=info msg="metadata content store policy set" policy=shared
May 27 18:07:42.009919 containerd[2775]: time="2025-05-27T18:07:42.009884360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 27 18:07:42.009959 containerd[2775]: time="2025-05-27T18:07:42.009938360Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 27 18:07:42.009959 containerd[2775]: time="2025-05-27T18:07:42.009952680Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.009964680Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.009977680Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.009989320Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.010000000Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.010011200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.010022200Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 27 18:07:42.010029 containerd[2775]: time="2025-05-27T18:07:42.010032440Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 27 18:07:42.010187 containerd[2775]: time="2025-05-27T18:07:42.010042760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 27 18:07:42.010187 containerd[2775]: time="2025-05-27T18:07:42.010054800Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 27 18:07:42.010187 containerd[2775]: time="2025-05-27T18:07:42.010163040Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 27 18:07:42.010187 containerd[2775]: time="2025-05-27T18:07:42.010182000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 27 18:07:42.010249 containerd[2775]: time="2025-05-27T18:07:42.010195760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 27 18:07:42.010249 containerd[2775]: time="2025-05-27T18:07:42.010206720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 27 18:07:42.010249 containerd[2775]: time="2025-05-27T18:07:42.010216600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 27 18:07:42.010249 containerd[2775]: time="2025-05-27T18:07:42.010226760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 27 18:07:42.010249 containerd[2775]: time="2025-05-27T18:07:42.010238840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 27 18:07:42.010331 containerd[2775]: time="2025-05-27T18:07:42.010256000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 27 18:07:42.010331 containerd[2775]: time="2025-05-27T18:07:42.010270960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 27 18:07:42.010331 containerd[2775]: time="2025-05-27T18:07:42.010281280Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 27 18:07:42.010331 containerd[2775]: time="2025-05-27T18:07:42.010290560Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 27 18:07:42.010488 containerd[2775]: time="2025-05-27T18:07:42.010472800Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 27 18:07:42.010510 containerd[2775]: time="2025-05-27T18:07:42.010489200Z" level=info msg="Start snapshots syncer"
May 27 18:07:42.010528 containerd[2775]: time="2025-05-27T18:07:42.010511600Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 27 18:07:42.010737 containerd[2775]: time="2025-05-27T18:07:42.010706200Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 27 18:07:42.010815 containerd[2775]: time="2025-05-27T18:07:42.010752920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 27 18:07:42.010843 containerd[2775]: time="2025-05-27T18:07:42.010820160Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 27 18:07:42.010936 containerd[2775]: time="2025-05-27T18:07:42.010920000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 27 18:07:42.010962 containerd[2775]: time="2025-05-27T18:07:42.010944360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 27 18:07:42.010962 containerd[2775]: time="2025-05-27T18:07:42.010955680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 27 18:07:42.010997 containerd[2775]: time="2025-05-27T18:07:42.010965360Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 27 18:07:42.010997 containerd[2775]: time="2025-05-27T18:07:42.010977080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 27 18:07:42.010997 containerd[2775]: time="2025-05-27T18:07:42.010986920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 27 18:07:42.010997 containerd[2775]: time="2025-05-27T18:07:42.010996040Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 27 18:07:42.011064 containerd[2775]: time="2025-05-27T18:07:42.011023400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 27 18:07:42.011064 containerd[2775]: time="2025-05-27T18:07:42.011035000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 27 18:07:42.011064 containerd[2775]: time="2025-05-27T18:07:42.011046680Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 27 18:07:42.011110 containerd[2775]: time="2025-05-27T18:07:42.011074760Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 18:07:42.011110 containerd[2775]: time="2025-05-27T18:07:42.011087000Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 27 18:07:42.011110 containerd[2775]: time="2025-05-27T18:07:42.011095120Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 18:07:42.011110 containerd[2775]: time="2025-05-27T18:07:42.011104440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 27 18:07:42.011175 containerd[2775]: time="2025-05-27T18:07:42.011112120Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 27 18:07:42.011175 containerd[2775]: time="2025-05-27T18:07:42.011123240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 27 18:07:42.011175 containerd[2775]: time="2025-05-27T18:07:42.011132920Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 27 18:07:42.011221 containerd[2775]: time="2025-05-27T18:07:42.011210800Z" level=info msg="runtime interface created"
May 27 18:07:42.011221 containerd[2775]: time="2025-05-27T18:07:42.011216320Z" level=info msg="created NRI interface"
May 27 18:07:42.011256 containerd[2775]: time="2025-05-27T18:07:42.011224960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 27 18:07:42.011256 containerd[2775]: time="2025-05-27T18:07:42.011236400Z" level=info msg="Connect containerd service"
May 27 18:07:42.011288 containerd[2775]: time="2025-05-27T18:07:42.011267240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 27 18:07:42.011878
containerd[2775]: time="2025-05-27T18:07:42.011856000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 18:07:42.093819 containerd[2775]: time="2025-05-27T18:07:42.093771600Z" level=info msg="Start subscribing containerd event" May 27 18:07:42.093859 containerd[2775]: time="2025-05-27T18:07:42.093847800Z" level=info msg="Start recovering state" May 27 18:07:42.093934 containerd[2775]: time="2025-05-27T18:07:42.093924480Z" level=info msg="Start event monitor" May 27 18:07:42.093954 containerd[2775]: time="2025-05-27T18:07:42.093941480Z" level=info msg="Start cni network conf syncer for default" May 27 18:07:42.093954 containerd[2775]: time="2025-05-27T18:07:42.093950720Z" level=info msg="Start streaming server" May 27 18:07:42.093985 containerd[2775]: time="2025-05-27T18:07:42.093958440Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 18:07:42.093985 containerd[2775]: time="2025-05-27T18:07:42.093964960Z" level=info msg="runtime interface starting up..." May 27 18:07:42.093985 containerd[2775]: time="2025-05-27T18:07:42.093971360Z" level=info msg="starting plugins..." May 27 18:07:42.093985 containerd[2775]: time="2025-05-27T18:07:42.093983720Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 18:07:42.094063 containerd[2775]: time="2025-05-27T18:07:42.094039360Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 18:07:42.094101 containerd[2775]: time="2025-05-27T18:07:42.094092520Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 18:07:42.094150 containerd[2775]: time="2025-05-27T18:07:42.094141080Z" level=info msg="containerd successfully booted in 0.101184s" May 27 18:07:42.094217 systemd[1]: Started containerd.service - containerd container runtime. 
May 27 18:07:42.146498 tar[2768]: linux-arm64/LICENSE May 27 18:07:42.146552 tar[2768]: linux-arm64/README.md May 27 18:07:42.175570 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 18:07:42.301596 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 233815889 May 27 18:07:42.317868 extend-filesystems[2756]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required May 27 18:07:42.317868 extend-filesystems[2756]: old_desc_blocks = 1, new_desc_blocks = 112 May 27 18:07:42.317868 extend-filesystems[2756]: The filesystem on /dev/nvme0n1p9 is now 233815889 (4k) blocks long. May 27 18:07:42.346229 extend-filesystems[2736]: Resized filesystem in /dev/nvme0n1p9 May 27 18:07:42.320387 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 18:07:42.320713 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 18:07:42.332982 systemd[1]: extend-filesystems.service: Consumed 220ms CPU time, 69.1M memory peak. May 27 18:07:42.664001 coreos-metadata[2729]: May 27 18:07:42.663 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 May 27 18:07:42.664518 coreos-metadata[2729]: May 27 18:07:42.664 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 18:07:42.890599 kernel: mlx5_core 0001:01:00.0 enP1p1s0f0np0: Link up May 27 18:07:42.907591 kernel: bond0: (slave enP1p1s0f0np0): Enslaving as a backup interface with an up link May 27 18:07:42.908625 systemd-networkd[2686]: enP1p1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:49:d0:ed.network. 
May 27 18:07:42.931241 coreos-metadata[2819]: May 27 18:07:42.931 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 May 27 18:07:42.931577 coreos-metadata[2819]: May 27 18:07:42.931 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata) May 27 18:07:43.533596 kernel: mlx5_core 0001:01:00.1 enP1p1s0f1np1: Link up May 27 18:07:43.550524 systemd-networkd[2686]: bond0: Configuring with /etc/systemd/network/05-bond0.network. May 27 18:07:43.550604 kernel: bond0: (slave enP1p1s0f1np1): Enslaving as a backup interface with an up link May 27 18:07:43.551754 systemd-networkd[2686]: enP1p1s0f0np0: Link UP May 27 18:07:43.552015 systemd-networkd[2686]: enP1p1s0f0np0: Gained carrier May 27 18:07:43.552043 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 18:07:43.570587 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond May 27 18:07:43.590037 systemd-networkd[2686]: enP1p1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:49:d0:ec.network. May 27 18:07:43.590336 systemd-networkd[2686]: enP1p1s0f1np1: Link UP May 27 18:07:43.590566 systemd-networkd[2686]: enP1p1s0f1np1: Gained carrier May 27 18:07:43.610865 systemd-networkd[2686]: bond0: Link UP May 27 18:07:43.611149 systemd-networkd[2686]: bond0: Gained carrier May 27 18:07:43.611325 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:43.611952 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:43.612229 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:43.612374 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. 
May 27 18:07:43.677221 kernel: bond0: (slave enP1p1s0f0np0): link status definitely up, 25000 Mbps full duplex May 27 18:07:43.677257 kernel: bond0: active interface up! May 27 18:07:43.800593 kernel: bond0: (slave enP1p1s0f1np1): link status definitely up, 25000 Mbps full duplex May 27 18:07:44.664608 coreos-metadata[2729]: May 27 18:07:44.664 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 May 27 18:07:44.881659 systemd-networkd[2686]: bond0: Gained IPv6LL May 27 18:07:44.882049 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:44.931712 coreos-metadata[2819]: May 27 18:07:44.931 INFO Fetching https://metadata.packet.net/metadata: Attempt #3 May 27 18:07:45.073941 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:45.074063 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:45.075881 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 18:07:45.081863 systemd[1]: Reached target network-online.target - Network is Online. May 27 18:07:45.089050 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 18:07:45.116056 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 18:07:45.138697 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 18:07:45.740399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 18:07:45.746654 (kubelet)[2895]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 18:07:46.127952 kubelet[2895]: E0527 18:07:46.127858 2895 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 18:07:46.130485 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 18:07:46.130635 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 18:07:46.130979 systemd[1]: kubelet.service: Consumed 744ms CPU time, 260.1M memory peak. May 27 18:07:46.995673 login[2836]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying May 27 18:07:46.995918 login[2835]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:47.005814 systemd-logind[2754]: New session 2 of user core. May 27 18:07:47.007089 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 18:07:47.008348 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 18:07:47.019980 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 18:07:47.022414 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 18:07:47.027990 (systemd)[2927]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 18:07:47.029850 systemd-logind[2754]: New session c1 of user core. May 27 18:07:47.106447 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 18:07:47.107708 systemd[1]: Started sshd@0-147.28.228.207:22-139.178.89.65:48962.service - OpenSSH per-connection server daemon (139.178.89.65:48962). 
May 27 18:07:47.146795 systemd[2927]: Queued start job for default target default.target. May 27 18:07:47.165768 systemd[2927]: Created slice app.slice - User Application Slice. May 27 18:07:47.165793 systemd[2927]: Reached target paths.target - Paths. May 27 18:07:47.165829 systemd[2927]: Reached target timers.target - Timers. May 27 18:07:47.167062 systemd[2927]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 18:07:47.175275 systemd[2927]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 18:07:47.175328 systemd[2927]: Reached target sockets.target - Sockets. May 27 18:07:47.175368 systemd[2927]: Reached target basic.target - Basic System. May 27 18:07:47.175395 systemd[2927]: Reached target default.target - Main User Target. May 27 18:07:47.175416 systemd[2927]: Startup finished in 141ms. May 27 18:07:47.175737 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 18:07:47.177319 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 18:07:47.468802 kernel: mlx5_core 0001:01:00.0: lag map: port 1:1 port 2:2 May 27 18:07:47.469070 kernel: mlx5_core 0001:01:00.0: shared_fdb:0 mode:queue_affinity May 27 18:07:47.532131 sshd[2934]: Accepted publickey for core from 139.178.89.65 port 48962 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:07:47.533496 sshd-session[2934]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:47.536496 systemd-logind[2754]: New session 3 of user core. May 27 18:07:47.557720 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 18:07:47.901454 systemd[1]: Started sshd@1-147.28.228.207:22-139.178.89.65:48974.service - OpenSSH per-connection server daemon (139.178.89.65:48974). May 27 18:07:47.997194 login[2836]: pam_unix(login:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:48.000152 systemd-logind[2754]: New session 1 of user core. 
May 27 18:07:48.024805 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 18:07:48.309037 sshd[2955]: Accepted publickey for core from 139.178.89.65 port 48974 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:07:48.310193 sshd-session[2955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:48.313015 systemd-logind[2754]: New session 4 of user core. May 27 18:07:48.325793 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 18:07:48.605773 sshd[2966]: Connection closed by 139.178.89.65 port 48974 May 27 18:07:48.606215 sshd-session[2955]: pam_unix(sshd:session): session closed for user core May 27 18:07:48.609876 systemd[1]: sshd@1-147.28.228.207:22-139.178.89.65:48974.service: Deactivated successfully. May 27 18:07:48.611468 systemd[1]: session-4.scope: Deactivated successfully. May 27 18:07:48.612069 systemd-logind[2754]: Session 4 logged out. Waiting for processes to exit. May 27 18:07:48.612911 systemd-logind[2754]: Removed session 4. May 27 18:07:48.685496 systemd[1]: Started sshd@2-147.28.228.207:22-139.178.89.65:48990.service - OpenSSH per-connection server daemon (139.178.89.65:48990). May 27 18:07:49.101777 sshd[2972]: Accepted publickey for core from 139.178.89.65 port 48990 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:07:49.102981 sshd-session[2972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:49.106083 systemd-logind[2754]: New session 5 of user core. May 27 18:07:49.127734 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 18:07:49.402069 sshd[2974]: Connection closed by 139.178.89.65 port 48990 May 27 18:07:49.402334 sshd-session[2972]: pam_unix(sshd:session): session closed for user core May 27 18:07:49.405448 systemd[1]: sshd@2-147.28.228.207:22-139.178.89.65:48990.service: Deactivated successfully. 
May 27 18:07:49.408111 systemd[1]: session-5.scope: Deactivated successfully. May 27 18:07:49.408688 systemd-logind[2754]: Session 5 logged out. Waiting for processes to exit. May 27 18:07:49.409534 systemd-logind[2754]: Removed session 5. May 27 18:07:49.970063 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. May 27 18:07:50.428837 coreos-metadata[2729]: May 27 18:07:50.428 INFO Fetch successful May 27 18:07:50.440379 coreos-metadata[2819]: May 27 18:07:50.440 INFO Fetch successful May 27 18:07:50.487448 unknown[2819]: wrote ssh authorized keys file for user: core May 27 18:07:50.510759 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 18:07:50.513847 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... May 27 18:07:50.526647 update-ssh-keys[2983]: Updated "/home/core/.ssh/authorized_keys" May 27 18:07:50.528664 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 18:07:50.530233 systemd[1]: Finished sshkeys.service. May 27 18:07:50.849656 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. May 27 18:07:50.850110 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 18:07:50.854632 systemd[1]: Startup finished in 5.001s (kernel) + 19.521s (initrd) + 13.034s (userspace) = 37.557s. May 27 18:07:56.381903 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 18:07:56.383891 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 18:07:56.524191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 18:07:56.527399 (kubelet)[3001]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 18:07:56.564839 kubelet[3001]: E0527 18:07:56.564800 3001 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 18:07:56.567897 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 18:07:56.568020 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 18:07:56.568303 systemd[1]: kubelet.service: Consumed 156ms CPU time, 115.9M memory peak. May 27 18:07:59.481227 systemd[1]: Started sshd@3-147.28.228.207:22-139.178.89.65:45654.service - OpenSSH per-connection server daemon (139.178.89.65:45654). May 27 18:07:59.911954 sshd[3025]: Accepted publickey for core from 139.178.89.65 port 45654 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:07:59.913057 sshd-session[3025]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:07:59.916076 systemd-logind[2754]: New session 6 of user core. May 27 18:07:59.937696 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 18:08:00.212132 sshd[3027]: Connection closed by 139.178.89.65 port 45654 May 27 18:08:00.212572 sshd-session[3025]: pam_unix(sshd:session): session closed for user core May 27 18:08:00.216373 systemd[1]: sshd@3-147.28.228.207:22-139.178.89.65:45654.service: Deactivated successfully. May 27 18:08:00.218064 systemd[1]: session-6.scope: Deactivated successfully. May 27 18:08:00.218629 systemd-logind[2754]: Session 6 logged out. Waiting for processes to exit. May 27 18:08:00.219496 systemd-logind[2754]: Removed session 6. 
May 27 18:08:00.291051 systemd[1]: Started sshd@4-147.28.228.207:22-139.178.89.65:45668.service - OpenSSH per-connection server daemon (139.178.89.65:45668). May 27 18:08:00.719553 sshd[3033]: Accepted publickey for core from 139.178.89.65 port 45668 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:08:00.720677 sshd-session[3033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:08:00.723569 systemd-logind[2754]: New session 7 of user core. May 27 18:08:00.746686 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 18:08:01.016692 sshd[3035]: Connection closed by 139.178.89.65 port 45668 May 27 18:08:01.017085 sshd-session[3033]: pam_unix(sshd:session): session closed for user core May 27 18:08:01.020522 systemd[1]: sshd@4-147.28.228.207:22-139.178.89.65:45668.service: Deactivated successfully. May 27 18:08:01.022234 systemd[1]: session-7.scope: Deactivated successfully. May 27 18:08:01.022798 systemd-logind[2754]: Session 7 logged out. Waiting for processes to exit. May 27 18:08:01.023661 systemd-logind[2754]: Removed session 7. May 27 18:08:01.097149 systemd[1]: Started sshd@5-147.28.228.207:22-139.178.89.65:45672.service - OpenSSH per-connection server daemon (139.178.89.65:45672). May 27 18:08:01.522864 sshd[3042]: Accepted publickey for core from 139.178.89.65 port 45672 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:08:01.523968 sshd-session[3042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:08:01.526803 systemd-logind[2754]: New session 8 of user core. May 27 18:08:01.549738 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 27 18:08:01.823096 sshd[3044]: Connection closed by 139.178.89.65 port 45672 May 27 18:08:01.823493 sshd-session[3042]: pam_unix(sshd:session): session closed for user core May 27 18:08:01.827101 systemd[1]: sshd@5-147.28.228.207:22-139.178.89.65:45672.service: Deactivated successfully. May 27 18:08:01.828733 systemd[1]: session-8.scope: Deactivated successfully. May 27 18:08:01.829284 systemd-logind[2754]: Session 8 logged out. Waiting for processes to exit. May 27 18:08:01.830145 systemd-logind[2754]: Removed session 8. May 27 18:08:01.904102 systemd[1]: Started sshd@6-147.28.228.207:22-139.178.89.65:45676.service - OpenSSH per-connection server daemon (139.178.89.65:45676). May 27 18:08:02.330662 sshd[3050]: Accepted publickey for core from 139.178.89.65 port 45676 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:08:02.331784 sshd-session[3050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:08:02.334728 systemd-logind[2754]: New session 9 of user core. May 27 18:08:02.355687 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 18:08:02.572049 sudo[3053]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 18:08:02.572290 sudo[3053]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 18:08:02.597040 sudo[3053]: pam_unix(sudo:session): session closed for user root May 27 18:08:02.664552 sshd[3052]: Connection closed by 139.178.89.65 port 45676 May 27 18:08:02.664996 sshd-session[3050]: pam_unix(sshd:session): session closed for user core May 27 18:08:02.668593 systemd[1]: sshd@6-147.28.228.207:22-139.178.89.65:45676.service: Deactivated successfully. May 27 18:08:02.670750 systemd[1]: session-9.scope: Deactivated successfully. May 27 18:08:02.671312 systemd-logind[2754]: Session 9 logged out. Waiting for processes to exit. May 27 18:08:02.672141 systemd-logind[2754]: Removed session 9. 
May 27 18:08:02.733327 systemd[1]: Started sshd@7-147.28.228.207:22-139.178.89.65:45684.service - OpenSSH per-connection server daemon (139.178.89.65:45684). May 27 18:08:03.152480 sshd[3059]: Accepted publickey for core from 139.178.89.65 port 45684 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:08:03.153683 sshd-session[3059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:08:03.156606 systemd-logind[2754]: New session 10 of user core. May 27 18:08:03.178748 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 18:08:03.384386 sudo[3063]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 18:08:03.384658 sudo[3063]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 18:08:03.387577 sudo[3063]: pam_unix(sudo:session): session closed for user root May 27 18:08:03.391784 sudo[3062]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 18:08:03.392021 sudo[3062]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 18:08:03.398847 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 18:08:03.444232 augenrules[3085]: No rules May 27 18:08:03.445296 systemd[1]: audit-rules.service: Deactivated successfully. May 27 18:08:03.446622 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 18:08:03.447399 sudo[3062]: pam_unix(sudo:session): session closed for user root May 27 18:08:03.509332 sshd[3061]: Connection closed by 139.178.89.65 port 45684 May 27 18:08:03.509690 sshd-session[3059]: pam_unix(sshd:session): session closed for user core May 27 18:08:03.512604 systemd[1]: sshd@7-147.28.228.207:22-139.178.89.65:45684.service: Deactivated successfully. May 27 18:08:03.515058 systemd[1]: session-10.scope: Deactivated successfully. 
May 27 18:08:03.515625 systemd-logind[2754]: Session 10 logged out. Waiting for processes to exit. May 27 18:08:03.516953 systemd-logind[2754]: Removed session 10. May 27 18:08:03.589275 systemd[1]: Started sshd@8-147.28.228.207:22-139.178.89.65:32854.service - OpenSSH per-connection server daemon (139.178.89.65:32854). May 27 18:08:04.006440 sshd[3094]: Accepted publickey for core from 139.178.89.65 port 32854 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:08:04.007587 sshd-session[3094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:08:04.010445 systemd-logind[2754]: New session 11 of user core. May 27 18:08:04.031697 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 18:08:04.238125 sudo[3097]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 18:08:04.238382 sudo[3097]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 18:08:04.544161 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 18:08:04.566872 (dockerd)[3126]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 18:08:04.786250 dockerd[3126]: time="2025-05-27T18:08:04.786203120Z" level=info msg="Starting up" May 27 18:08:04.787460 dockerd[3126]: time="2025-05-27T18:08:04.787436880Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 18:08:04.823806 dockerd[3126]: time="2025-05-27T18:08:04.823733160Z" level=info msg="Loading containers: start." May 27 18:08:04.835594 kernel: Initializing XFRM netlink socket May 27 18:08:05.008423 systemd-timesyncd[2688]: Network configuration changed, trying to establish connection. 
May 27 18:08:05.038705 systemd-networkd[2686]: docker0: Link UP May 27 18:08:05.039502 dockerd[3126]: time="2025-05-27T18:08:05.039473240Z" level=info msg="Loading containers: done." May 27 18:08:05.048354 dockerd[3126]: time="2025-05-27T18:08:05.048327800Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 18:08:05.048422 dockerd[3126]: time="2025-05-27T18:08:05.048390440Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 18:08:05.048494 dockerd[3126]: time="2025-05-27T18:08:05.048482560Z" level=info msg="Initializing buildkit" May 27 18:08:05.049981 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck911068425-merged.mount: Deactivated successfully. May 27 18:08:05.062364 dockerd[3126]: time="2025-05-27T18:08:05.062338520Z" level=info msg="Completed buildkit initialization" May 27 18:08:05.068010 dockerd[3126]: time="2025-05-27T18:08:05.067970000Z" level=info msg="Daemon has completed initialization" May 27 18:08:05.068070 dockerd[3126]: time="2025-05-27T18:08:05.068025400Z" level=info msg="API listen on /run/docker.sock" May 27 18:08:05.068158 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 18:08:05.254376 systemd-timesyncd[2688]: Contacted time server [2605:6400:40:fec3:5a00:b1ba:9a51:c93b]:123 (2.flatcar.pool.ntp.org). May 27 18:08:05.254431 systemd-timesyncd[2688]: Initial clock synchronization to Tue 2025-05-27 18:08:05.028648 UTC. May 27 18:08:05.613796 containerd[2775]: time="2025-05-27T18:08:05.613733360Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\"" May 27 18:08:06.183538 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2989871863.mount: Deactivated successfully. 
May 27 18:08:06.818361 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 18:08:06.819769 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 18:08:06.959873 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 18:08:06.963028 (kubelet)[3451]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 18:08:07.006458 kubelet[3451]: E0527 18:08:07.006425 3451 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 18:08:07.009018 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 18:08:07.009140 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 18:08:07.010648 systemd[1]: kubelet.service: Consumed 139ms CPU time, 114.9M memory peak. 
May 27 18:08:07.989722 containerd[2775]: time="2025-05-27T18:08:07.989670116Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.9: active requests=0, bytes read=25651974"
May 27 18:08:07.989722 containerd[2775]: time="2025-05-27T18:08:07.989676007Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:07.990680 containerd[2775]: time="2025-05-27T18:08:07.990659457Z" level=info msg="ImageCreate event name:\"sha256:90d52158b7646075e7e560c1bd670904ba3f4f4c8c199106bf96ee0944663d61\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:07.992976 containerd[2775]: time="2025-05-27T18:08:07.992952979Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:07.993908 containerd[2775]: time="2025-05-27T18:08:07.993881655Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.9\" with image id \"sha256:90d52158b7646075e7e560c1bd670904ba3f4f4c8c199106bf96ee0944663d61\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5b68f0df22013422dc8fb9ddfcff513eb6fc92f9dbf8aae41555c895efef5a20\", size \"25648774\" in 2.380114169s"
May 27 18:08:07.993936 containerd[2775]: time="2025-05-27T18:08:07.993919185Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.9\" returns image reference \"sha256:90d52158b7646075e7e560c1bd670904ba3f4f4c8c199106bf96ee0944663d61\""
May 27 18:08:07.995054 containerd[2775]: time="2025-05-27T18:08:07.995033370Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\""
May 27 18:08:09.819256 containerd[2775]: time="2025-05-27T18:08:09.819220326Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:09.819508 containerd[2775]: time="2025-05-27T18:08:09.819294185Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.9: active requests=0, bytes read=22459528"
May 27 18:08:09.820191 containerd[2775]: time="2025-05-27T18:08:09.820164474Z" level=info msg="ImageCreate event name:\"sha256:2d03fe540daca1d9520c403342787715eab3b05fb6773ea41153572716c82dba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:09.822390 containerd[2775]: time="2025-05-27T18:08:09.822364216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:09.823306 containerd[2775]: time="2025-05-27T18:08:09.823280893Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.9\" with image id \"sha256:2d03fe540daca1d9520c403342787715eab3b05fb6773ea41153572716c82dba\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:be9e7987d323b38a12e28436cff6d6ec6fc31ffdd3ea11eaa9d74852e9d31248\", size \"23995294\" in 1.828217281s"
May 27 18:08:09.823332 containerd[2775]: time="2025-05-27T18:08:09.823313544Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.9\" returns image reference \"sha256:2d03fe540daca1d9520c403342787715eab3b05fb6773ea41153572716c82dba\""
May 27 18:08:09.823692 containerd[2775]: time="2025-05-27T18:08:09.823675105Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\""
May 27 18:08:11.276603 containerd[2775]: time="2025-05-27T18:08:11.276541012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:11.276979 containerd[2775]: time="2025-05-27T18:08:11.276547596Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.9: active requests=0, bytes read=17125279"
May 27 18:08:11.277526 containerd[2775]: time="2025-05-27T18:08:11.277498999Z" level=info msg="ImageCreate event name:\"sha256:b333fec06af219faaf48f1784baa0b7274945b2e5be5bd2fca2681f7d1baff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:11.279766 containerd[2775]: time="2025-05-27T18:08:11.279741857Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:11.280729 containerd[2775]: time="2025-05-27T18:08:11.280708910Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.9\" with image id \"sha256:b333fec06af219faaf48f1784baa0b7274945b2e5be5bd2fca2681f7d1baff5f\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:eb358c7346bb17ab2c639c3ff8ab76a147dec7ae609f5c0c2800233e42253ed1\", size \"18661063\" in 1.457004878s"
May 27 18:08:11.280762 containerd[2775]: time="2025-05-27T18:08:11.280734928Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.9\" returns image reference \"sha256:b333fec06af219faaf48f1784baa0b7274945b2e5be5bd2fca2681f7d1baff5f\""
May 27 18:08:11.281049 containerd[2775]: time="2025-05-27T18:08:11.281030788Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\""
May 27 18:08:11.894780 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1519849946.mount: Deactivated successfully.
May 27 18:08:12.439690 containerd[2775]: time="2025-05-27T18:08:12.439656751Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:12.439973 containerd[2775]: time="2025-05-27T18:08:12.439642296Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.9: active requests=0, bytes read=26871375"
May 27 18:08:12.440247 containerd[2775]: time="2025-05-27T18:08:12.440225662Z" level=info msg="ImageCreate event name:\"sha256:cbfba5e6542fe387b24d9e73bf5a054a6b07b95af1392268fd82b6f449ef1c27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:12.441702 containerd[2775]: time="2025-05-27T18:08:12.441680778Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:12.442259 containerd[2775]: time="2025-05-27T18:08:12.442234404Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.9\" with image id \"sha256:cbfba5e6542fe387b24d9e73bf5a054a6b07b95af1392268fd82b6f449ef1c27\", repo tag \"registry.k8s.io/kube-proxy:v1.31.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:fdf026cf2434537e499e9c739d189ca8fc57101d929ac5ccd8e24f979a9738c1\", size \"26870394\" in 1.161172576s"
May 27 18:08:12.442283 containerd[2775]: time="2025-05-27T18:08:12.442267382Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.9\" returns image reference \"sha256:cbfba5e6542fe387b24d9e73bf5a054a6b07b95af1392268fd82b6f449ef1c27\""
May 27 18:08:12.442620 containerd[2775]: time="2025-05-27T18:08:12.442606161Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
May 27 18:08:12.769023 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2768042766.mount: Deactivated successfully.
May 27 18:08:13.742744 containerd[2775]: time="2025-05-27T18:08:13.742686688Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951622"
May 27 18:08:13.742744 containerd[2775]: time="2025-05-27T18:08:13.742688943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:13.743779 containerd[2775]: time="2025-05-27T18:08:13.743724511Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:13.746216 containerd[2775]: time="2025-05-27T18:08:13.746165588Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:13.747310 containerd[2775]: time="2025-05-27T18:08:13.747202581Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.304571996s"
May 27 18:08:13.747310 containerd[2775]: time="2025-05-27T18:08:13.747236798Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
May 27 18:08:13.747847 containerd[2775]: time="2025-05-27T18:08:13.747653138Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 27 18:08:14.000156 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount244152479.mount: Deactivated successfully.
May 27 18:08:14.000505 containerd[2775]: time="2025-05-27T18:08:14.000476266Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:08:14.001171 containerd[2775]: time="2025-05-27T18:08:14.001151864Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268703"
May 27 18:08:14.001607 containerd[2775]: time="2025-05-27T18:08:14.001589025Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:08:14.003660 containerd[2775]: time="2025-05-27T18:08:14.003631648Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:08:14.004145 containerd[2775]: time="2025-05-27T18:08:14.004121378Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 256.432227ms"
May 27 18:08:14.004173 containerd[2775]: time="2025-05-27T18:08:14.004150850Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
May 27 18:08:14.004485 containerd[2775]: time="2025-05-27T18:08:14.004466126Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
May 27 18:08:14.218071 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2467293865.mount: Deactivated successfully.
May 27 18:08:17.065131 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 27 18:08:17.066604 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:08:17.210328 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:17.213354 (kubelet)[3617]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 18:08:17.243211 kubelet[3617]: E0527 18:08:17.243177 3617 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 18:08:17.245561 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 18:08:17.245702 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 18:08:17.246182 systemd[1]: kubelet.service: Consumed 139ms CPU time, 115.2M memory peak.
May 27 18:08:17.371336 containerd[2775]: time="2025-05-27T18:08:17.371233836Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:17.371664 containerd[2775]: time="2025-05-27T18:08:17.371248143Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406465"
May 27 18:08:17.372283 containerd[2775]: time="2025-05-27T18:08:17.372258067Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:17.374741 containerd[2775]: time="2025-05-27T18:08:17.374722174Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:08:17.375827 containerd[2775]: time="2025-05-27T18:08:17.375787536Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 3.371288095s"
May 27 18:08:17.375874 containerd[2775]: time="2025-05-27T18:08:17.375836099Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
May 27 18:08:23.312160 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:23.312404 systemd[1]: kubelet.service: Consumed 139ms CPU time, 115.2M memory peak.
May 27 18:08:23.314702 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:08:23.332217 systemd[1]: Reload requested from client PID 3702 ('systemctl') (unit session-11.scope)...
May 27 18:08:23.332227 systemd[1]: Reloading...
May 27 18:08:23.392591 zram_generator::config[3749]: No configuration found.
May 27 18:08:23.466857 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:08:23.569221 systemd[1]: Reloading finished in 236 ms.
May 27 18:08:23.628039 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 27 18:08:23.628118 systemd[1]: kubelet.service: Failed with result 'signal'.
May 27 18:08:23.628368 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:23.630611 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:08:23.760354 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:23.763607 (kubelet)[3809]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 27 18:08:23.794611 kubelet[3809]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:08:23.794611 kubelet[3809]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 27 18:08:23.794611 kubelet[3809]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:08:23.794879 kubelet[3809]: I0527 18:08:23.794656 3809 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 27 18:08:24.554769 kubelet[3809]: I0527 18:08:24.554733 3809 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
May 27 18:08:24.554769 kubelet[3809]: I0527 18:08:24.554759 3809 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 27 18:08:24.554982 kubelet[3809]: I0527 18:08:24.554963 3809 server.go:934] "Client rotation is on, will bootstrap in background"
May 27 18:08:24.577070 kubelet[3809]: E0527 18:08:24.577046 3809 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://147.28.228.207:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 147.28.228.207:6443: connect: connection refused" logger="UnhandledError"
May 27 18:08:24.577992 kubelet[3809]: I0527 18:08:24.577977 3809 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 18:08:24.585648 kubelet[3809]: I0527 18:08:24.585619 3809 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 27 18:08:24.607588 kubelet[3809]: I0527 18:08:24.607561 3809 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 27 18:08:24.608345 kubelet[3809]: I0527 18:08:24.608326 3809 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 27 18:08:24.608482 kubelet[3809]: I0527 18:08:24.608457 3809 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 18:08:24.608639 kubelet[3809]: I0527 18:08:24.608484 3809 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-e5d745d36c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 27 18:08:24.608779 kubelet[3809]: I0527 18:08:24.608770 3809 topology_manager.go:138] "Creating topology manager with none policy"
May 27 18:08:24.608779 kubelet[3809]: I0527 18:08:24.608780 3809 container_manager_linux.go:300] "Creating device plugin manager"
May 27 18:08:24.609014 kubelet[3809]: I0527 18:08:24.609005 3809 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:08:24.610928 kubelet[3809]: I0527 18:08:24.610911 3809 kubelet.go:408] "Attempting to sync node with API server"
May 27 18:08:24.610951 kubelet[3809]: I0527 18:08:24.610933 3809 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 27 18:08:24.610974 kubelet[3809]: I0527 18:08:24.610954 3809 kubelet.go:314] "Adding apiserver pod source"
May 27 18:08:24.610974 kubelet[3809]: I0527 18:08:24.610965 3809 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 27 18:08:24.615758 kubelet[3809]: I0527 18:08:24.615741 3809 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 27 18:08:24.616551 kubelet[3809]: I0527 18:08:24.616538 3809 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 27 18:08:24.616664 kubelet[3809]: W0527 18:08:24.616652 3809 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 27 18:08:24.616843 kubelet[3809]: W0527 18:08:24.616735 3809 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://147.28.228.207:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 147.28.228.207:6443: connect: connection refused
May 27 18:08:24.616880 kubelet[3809]: E0527 18:08:24.616863 3809 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://147.28.228.207:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 147.28.228.207:6443: connect: connection refused" logger="UnhandledError"
May 27 18:08:24.616880 kubelet[3809]: W0527 18:08:24.616741 3809 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://147.28.228.207:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-e5d745d36c&limit=500&resourceVersion=0": dial tcp 147.28.228.207:6443: connect: connection refused
May 27 18:08:24.616922 kubelet[3809]: E0527 18:08:24.616895 3809 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://147.28.228.207:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-e5d745d36c&limit=500&resourceVersion=0\": dial tcp 147.28.228.207:6443: connect: connection refused" logger="UnhandledError"
May 27 18:08:24.617558 kubelet[3809]: I0527 18:08:24.617544 3809 server.go:1274] "Started kubelet"
May 27 18:08:24.617672 kubelet[3809]: I0527 18:08:24.617620 3809 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 27 18:08:24.617849 kubelet[3809]: I0527 18:08:24.617792 3809 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 27 18:08:24.617917 kubelet[3809]: I0527 18:08:24.617902 3809 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 18:08:24.618790 kubelet[3809]: I0527 18:08:24.618775 3809 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 27 18:08:24.618833 kubelet[3809]: I0527 18:08:24.618782 3809 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 27 18:08:24.618863 kubelet[3809]: I0527 18:08:24.618851 3809 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 27 18:08:24.618932 kubelet[3809]: I0527 18:08:24.618920 3809 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
May 27 18:08:24.618975 kubelet[3809]: I0527 18:08:24.618969 3809 reconciler.go:26] "Reconciler: start to sync state"
May 27 18:08:24.619245 kubelet[3809]: E0527 18:08:24.618965 3809 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-e5d745d36c\" not found"
May 27 18:08:24.619442 kubelet[3809]: E0527 18:08:24.619407 3809 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.228.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-e5d745d36c?timeout=10s\": dial tcp 147.28.228.207:6443: connect: connection refused" interval="200ms"
May 27 18:08:24.619475 kubelet[3809]: W0527 18:08:24.619414 3809 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://147.28.228.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 147.28.228.207:6443: connect: connection refused
May 27 18:08:24.619499 kubelet[3809]: E0527 18:08:24.619469 3809 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://147.28.228.207:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 147.28.228.207:6443: connect: connection refused" logger="UnhandledError"
May 27 18:08:24.619651 kubelet[3809]: I0527 18:08:24.619636 3809 factory.go:221] Registration of the systemd container factory successfully
May 27 18:08:24.619738 kubelet[3809]: I0527 18:08:24.619722 3809 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 27 18:08:24.619823 kubelet[3809]: E0527 18:08:24.619729 3809 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 27 18:08:24.620380 kubelet[3809]: I0527 18:08:24.620365 3809 server.go:449] "Adding debug handlers to kubelet server"
May 27 18:08:24.620650 kubelet[3809]: I0527 18:08:24.620634 3809 factory.go:221] Registration of the containerd container factory successfully
May 27 18:08:24.622486 kubelet[3809]: E0527 18:08:24.620197 3809 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://147.28.228.207:6443/api/v1/namespaces/default/events\": dial tcp 147.28.228.207:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-a-e5d745d36c.1843749e89ec9670 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-a-e5d745d36c,UID:ci-4344.0.0-a-e5d745d36c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-a-e5d745d36c,},FirstTimestamp:2025-05-27 18:08:24.6175228 +0000 UTC m=+0.851167455,LastTimestamp:2025-05-27 18:08:24.6175228 +0000 UTC m=+0.851167455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-a-e5d745d36c,}"
May 27 18:08:24.634140 kubelet[3809]: I0527 18:08:24.634105 3809 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 27 18:08:24.635109 kubelet[3809]: I0527 18:08:24.635096 3809 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 27 18:08:24.635146 kubelet[3809]: I0527 18:08:24.635113 3809 status_manager.go:217] "Starting to sync pod status with apiserver"
May 27 18:08:24.635146 kubelet[3809]: I0527 18:08:24.635128 3809 kubelet.go:2321] "Starting kubelet main sync loop"
May 27 18:08:24.635191 kubelet[3809]: I0527 18:08:24.635153 3809 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 27 18:08:24.635191 kubelet[3809]: I0527 18:08:24.635167 3809 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 27 18:08:24.635191 kubelet[3809]: E0527 18:08:24.635163 3809 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 27 18:08:24.635191 kubelet[3809]: I0527 18:08:24.635184 3809 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:08:24.635854 kubelet[3809]: I0527 18:08:24.635840 3809 policy_none.go:49] "None policy: Start"
May 27 18:08:24.636232 kubelet[3809]: I0527 18:08:24.636221 3809 memory_manager.go:170] "Starting memorymanager" policy="None"
May 27 18:08:24.636269 kubelet[3809]: I0527 18:08:24.636243 3809 state_mem.go:35] "Initializing new in-memory state store"
May 27 18:08:24.636729 kubelet[3809]: W0527 18:08:24.636685 3809 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://147.28.228.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 147.28.228.207:6443: connect: connection refused
May 27 18:08:24.636768 kubelet[3809]: E0527 18:08:24.636741 3809 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://147.28.228.207:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 147.28.228.207:6443: connect: connection refused" logger="UnhandledError"
May 27 18:08:24.642771 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 27 18:08:24.661704 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 27 18:08:24.664263 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 27 18:08:24.684276 kubelet[3809]: I0527 18:08:24.684256 3809 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 27 18:08:24.684448 kubelet[3809]: I0527 18:08:24.684430 3809 eviction_manager.go:189] "Eviction manager: starting control loop"
May 27 18:08:24.684496 kubelet[3809]: I0527 18:08:24.684442 3809 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 27 18:08:24.684627 kubelet[3809]: I0527 18:08:24.684612 3809 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 27 18:08:24.685437 kubelet[3809]: E0527 18:08:24.685418 3809 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344.0.0-a-e5d745d36c\" not found"
May 27 18:08:24.742678 systemd[1]: Created slice kubepods-burstable-pod3aa78242bb1927e3b91901392d5918d0.slice - libcontainer container kubepods-burstable-pod3aa78242bb1927e3b91901392d5918d0.slice.
May 27 18:08:24.780101 systemd[1]: Created slice kubepods-burstable-pod11d3ac03b7a3ab8910deac3e7e920238.slice - libcontainer container kubepods-burstable-pod11d3ac03b7a3ab8910deac3e7e920238.slice.
May 27 18:08:24.785969 kubelet[3809]: I0527 18:08:24.785949 3809 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.786388 kubelet[3809]: E0527 18:08:24.786362 3809 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://147.28.228.207:6443/api/v1/nodes\": dial tcp 147.28.228.207:6443: connect: connection refused" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.804754 systemd[1]: Created slice kubepods-burstable-pod73db487bdeb2025626e402cab091d2c9.slice - libcontainer container kubepods-burstable-pod73db487bdeb2025626e402cab091d2c9.slice.
May 27 18:08:24.820207 kubelet[3809]: I0527 18:08:24.820184 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820423 kubelet[3809]: I0527 18:08:24.820216 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73db487bdeb2025626e402cab091d2c9-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-e5d745d36c\" (UID: \"73db487bdeb2025626e402cab091d2c9\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820423 kubelet[3809]: I0527 18:08:24.820235 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820423 kubelet[3809]: I0527 18:08:24.820264 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820423 kubelet[3809]: I0527 18:08:24.820300 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820423 kubelet[3809]: I0527 18:08:24.820325 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820524 kubelet[3809]: I0527 18:08:24.820343 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820524 kubelet[3809]: I0527 18:08:24.820359 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820524 kubelet[3809]: I0527 18:08:24.820374 3809 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.820524 kubelet[3809]: E0527 18:08:24.820364 3809 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.228.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-e5d745d36c?timeout=10s\": dial tcp 147.28.228.207:6443: connect: connection refused" interval="400ms"
May 27 18:08:24.988893 kubelet[3809]: I0527 18:08:24.988870 3809 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:24.989186 kubelet[3809]: E0527 18:08:24.989160 3809 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://147.28.228.207:6443/api/v1/nodes\": dial tcp 147.28.228.207:6443: connect: connection refused" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:25.079233 containerd[2775]: time="2025-05-27T18:08:25.079155408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-e5d745d36c,Uid:3aa78242bb1927e3b91901392d5918d0,Namespace:kube-system,Attempt:0,}"
May 27 18:08:25.103130 containerd[2775]: time="2025-05-27T18:08:25.103099744Z" level=info msg="connecting to shim 4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a" address="unix:///run/containerd/s/6e3e80da8c0b47a27809be51a3c8ad6f841ae97edadb3169a9152cde68c90fee" namespace=k8s.io protocol=ttrpc version=3
May 27 18:08:25.103597 containerd[2775]: time="2025-05-27T18:08:25.103563628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-e5d745d36c,Uid:11d3ac03b7a3ab8910deac3e7e920238,Namespace:kube-system,Attempt:0,}"
May 27 18:08:25.107052 containerd[2775]: time="2025-05-27T18:08:25.107024893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-e5d745d36c,Uid:73db487bdeb2025626e402cab091d2c9,Namespace:kube-system,Attempt:0,}"
May 27 18:08:25.112118 containerd[2775]: time="2025-05-27T18:08:25.112090255Z" level=info msg="connecting to shim 30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d" address="unix:///run/containerd/s/f11cda3e480c2b180d6ea2f991d6eceaa0273c4daf89bbb7ee4ada2bc5eee7b0" namespace=k8s.io protocol=ttrpc version=3
May 27 18:08:25.115463 containerd[2775]: time="2025-05-27T18:08:25.115432546Z" level=info msg="connecting to shim 99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3" address="unix:///run/containerd/s/9ba2decc4fd1d46d0f5e9a84e1359c355df586bec166840b623063d189f53e45" namespace=k8s.io protocol=ttrpc version=3
May 27 18:08:25.139790 systemd[1]: Started cri-containerd-4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a.scope - libcontainer container 4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a.
May 27 18:08:25.142901 systemd[1]: Started cri-containerd-30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d.scope - libcontainer container 30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d.
May 27 18:08:25.144178 systemd[1]: Started cri-containerd-99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3.scope - libcontainer container 99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3.
May 27 18:08:25.164923 containerd[2775]: time="2025-05-27T18:08:25.164895501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-e5d745d36c,Uid:3aa78242bb1927e3b91901392d5918d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a\""
May 27 18:08:25.167092 containerd[2775]: time="2025-05-27T18:08:25.167068086Z" level=info msg="CreateContainer within sandbox \"4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 27 18:08:25.167309 containerd[2775]: time="2025-05-27T18:08:25.167289871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-e5d745d36c,Uid:11d3ac03b7a3ab8910deac3e7e920238,Namespace:kube-system,Attempt:0,} returns sandbox id \"30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d\""
May 27 18:08:25.170269 containerd[2775]: time="2025-05-27T18:08:25.170239161Z" level=info msg="CreateContainer within sandbox \"30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 27 18:08:25.170621 containerd[2775]: time="2025-05-27T18:08:25.170596722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-e5d745d36c,Uid:73db487bdeb2025626e402cab091d2c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3\""
May 27 18:08:25.172512 containerd[2775]: time="2025-05-27T18:08:25.172488055Z" level=info msg="CreateContainer within sandbox \"99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 27 18:08:25.173356 containerd[2775]: time="2025-05-27T18:08:25.173333806Z" level=info msg="Container f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23: CDI devices from CRI Config.CDIDevices: []"
May 27 18:08:25.174205 containerd[2775]: time="2025-05-27T18:08:25.174181831Z" level=info msg="Container aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42: CDI devices from CRI Config.CDIDevices: []"
May 27 18:08:25.176165 containerd[2775]: time="2025-05-27T18:08:25.176144884Z" level=info msg="Container 02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2: CDI devices from CRI Config.CDIDevices: []"
May 27 18:08:25.177548 containerd[2775]: time="2025-05-27T18:08:25.177529191Z" level=info msg="CreateContainer within sandbox \"4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23\""
May 27 18:08:25.177677 containerd[2775]: time="2025-05-27T18:08:25.177655948Z" level=info msg="CreateContainer within sandbox \"30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42\""
May 27 18:08:25.177932 containerd[2775]: time="2025-05-27T18:08:25.177916965Z" level=info msg="StartContainer for \"aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42\""
May 27 18:08:25.178004 containerd[2775]: time="2025-05-27T18:08:25.177985252Z" level=info msg="StartContainer for \"f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23\""
May 27 18:08:25.178913 containerd[2775]: time="2025-05-27T18:08:25.178891627Z" level=info msg="connecting to shim aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42" address="unix:///run/containerd/s/f11cda3e480c2b180d6ea2f991d6eceaa0273c4daf89bbb7ee4ada2bc5eee7b0" protocol=ttrpc version=3
May 27 18:08:25.178942 containerd[2775]: time="2025-05-27T18:08:25.178918407Z" level=info msg="connecting to shim f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23" address="unix:///run/containerd/s/6e3e80da8c0b47a27809be51a3c8ad6f841ae97edadb3169a9152cde68c90fee" protocol=ttrpc version=3
May 27 18:08:25.178970 containerd[2775]: time="2025-05-27T18:08:25.178905915Z" level=info msg="CreateContainer within sandbox \"99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2\""
May 27 18:08:25.179197 containerd[2775]: time="2025-05-27T18:08:25.179178266Z" level=info msg="StartContainer for \"02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2\""
May 27 18:08:25.180064 containerd[2775]: time="2025-05-27T18:08:25.180043413Z" level=info msg="connecting to shim 02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2" address="unix:///run/containerd/s/9ba2decc4fd1d46d0f5e9a84e1359c355df586bec166840b623063d189f53e45" protocol=ttrpc version=3
May 27 18:08:25.209703 systemd[1]: Started cri-containerd-02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2.scope - libcontainer container 02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2.
May 27 18:08:25.210768 systemd[1]: Started cri-containerd-aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42.scope - libcontainer container aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42.
May 27 18:08:25.211842 systemd[1]: Started cri-containerd-f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23.scope - libcontainer container f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23.
May 27 18:08:25.220797 kubelet[3809]: E0527 18:08:25.220757 3809 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://147.28.228.207:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-e5d745d36c?timeout=10s\": dial tcp 147.28.228.207:6443: connect: connection refused" interval="800ms"
May 27 18:08:25.238357 containerd[2775]: time="2025-05-27T18:08:25.238322707Z" level=info msg="StartContainer for \"02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2\" returns successfully"
May 27 18:08:25.239353 containerd[2775]: time="2025-05-27T18:08:25.239328660Z" level=info msg="StartContainer for \"f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23\" returns successfully"
May 27 18:08:25.241111 containerd[2775]: time="2025-05-27T18:08:25.241088926Z" level=info msg="StartContainer for \"aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42\" returns successfully"
May 27 18:08:25.391623 kubelet[3809]: I0527 18:08:25.391525 3809 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:27.214114 kubelet[3809]: E0527 18:08:27.214082 3809 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344.0.0-a-e5d745d36c\" not found" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:27.299614 update_engine[2765]: I20250527 18:08:27.299537 2765 update_attempter.cc:509] Updating boot flags...
May 27 18:08:27.326103 kubelet[3809]: I0527 18:08:27.326074 3809 kubelet_node_status.go:75] "Successfully registered node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:27.613776 kubelet[3809]: I0527 18:08:27.613737 3809 apiserver.go:52] "Watching apiserver"
May 27 18:08:27.619058 kubelet[3809]: I0527 18:08:27.619032 3809 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
May 27 18:08:27.647387 kubelet[3809]: E0527 18:08:27.647358 3809 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.195423 systemd[1]: Reload requested from client PID 4249 ('systemctl') (unit session-11.scope)...
May 27 18:08:29.195434 systemd[1]: Reloading...
May 27 18:08:29.270596 zram_generator::config[4299]: No configuration found.
May 27 18:08:29.345129 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:08:29.376021 kubelet[3809]: W0527 18:08:29.375999 3809 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 27 18:08:29.458554 systemd[1]: Reloading finished in 262 ms.
May 27 18:08:29.489979 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:08:29.514357 systemd[1]: kubelet.service: Deactivated successfully.
May 27 18:08:29.515672 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:29.515733 systemd[1]: kubelet.service: Consumed 1.293s CPU time, 144.5M memory peak.
May 27 18:08:29.517506 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:08:29.667662 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:08:29.671064 (kubelet)[4358]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 27 18:08:29.701234 kubelet[4358]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:08:29.701234 kubelet[4358]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 27 18:08:29.701234 kubelet[4358]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:08:29.701416 kubelet[4358]: I0527 18:08:29.701293 4358 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 27 18:08:29.706166 kubelet[4358]: I0527 18:08:29.706145 4358 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
May 27 18:08:29.706203 kubelet[4358]: I0527 18:08:29.706168 4358 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 27 18:08:29.706395 kubelet[4358]: I0527 18:08:29.706382 4358 server.go:934] "Client rotation is on, will bootstrap in background"
May 27 18:08:29.707649 kubelet[4358]: I0527 18:08:29.707636 4358 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
May 27 18:08:29.709670 kubelet[4358]: I0527 18:08:29.709615 4358 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 18:08:29.712425 kubelet[4358]: I0527 18:08:29.712408 4358 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 27 18:08:29.731383 kubelet[4358]: I0527 18:08:29.731357 4358 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 27 18:08:29.731474 kubelet[4358]: I0527 18:08:29.731458 4358 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 27 18:08:29.731583 kubelet[4358]: I0527 18:08:29.731554 4358 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 18:08:29.731747 kubelet[4358]: I0527 18:08:29.731587 4358 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-e5d745d36c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 27 18:08:29.731829 kubelet[4358]: I0527 18:08:29.731751 4358 topology_manager.go:138] "Creating topology manager with none policy"
May 27 18:08:29.731829 kubelet[4358]: I0527 18:08:29.731761 4358 container_manager_linux.go:300] "Creating device plugin manager"
May 27 18:08:29.731829 kubelet[4358]: I0527 18:08:29.731792 4358 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:08:29.731928 kubelet[4358]: I0527 18:08:29.731875 4358 kubelet.go:408] "Attempting to sync node with API server"
May 27 18:08:29.731928 kubelet[4358]: I0527 18:08:29.731889 4358 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 27 18:08:29.731928 kubelet[4358]: I0527 18:08:29.731905 4358 kubelet.go:314] "Adding apiserver pod source"
May 27 18:08:29.731928 kubelet[4358]: I0527 18:08:29.731914 4358 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 27 18:08:29.732340 kubelet[4358]: I0527 18:08:29.732319 4358 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 27 18:08:29.733926 kubelet[4358]: I0527 18:08:29.733904 4358 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 27 18:08:29.734312 kubelet[4358]: I0527 18:08:29.734299 4358 server.go:1274] "Started kubelet"
May 27 18:08:29.734408 kubelet[4358]: I0527 18:08:29.734344 4358 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 27 18:08:29.734484 kubelet[4358]: I0527 18:08:29.734439 4358 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 27 18:08:29.734675 kubelet[4358]: I0527 18:08:29.734661 4358 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 18:08:29.735296 kubelet[4358]: I0527 18:08:29.735279 4358 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 27 18:08:29.735296 kubelet[4358]: I0527 18:08:29.735290 4358 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 27 18:08:29.735341 kubelet[4358]: I0527 18:08:29.735317 4358 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 27 18:08:29.735362 kubelet[4358]: E0527 18:08:29.735346 4358 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-e5d745d36c\" not found"
May 27 18:08:29.735415 kubelet[4358]: I0527 18:08:29.735388 4358 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
May 27 18:08:29.735478 kubelet[4358]: I0527 18:08:29.735441 4358 reconciler.go:26] "Reconciler: start to sync state"
May 27 18:08:29.735690 kubelet[4358]: E0527 18:08:29.735671 4358 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 27 18:08:29.735780 kubelet[4358]: I0527 18:08:29.735761 4358 factory.go:221] Registration of the systemd container factory successfully
May 27 18:08:29.735880 kubelet[4358]: I0527 18:08:29.735864 4358 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 27 18:08:29.736481 kubelet[4358]: I0527 18:08:29.736466 4358 server.go:449] "Adding debug handlers to kubelet server"
May 27 18:08:29.736608 kubelet[4358]: I0527 18:08:29.736594 4358 factory.go:221] Registration of the containerd container factory successfully
May 27 18:08:29.742737 kubelet[4358]: I0527 18:08:29.742706 4358 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 27 18:08:29.743653 kubelet[4358]: I0527 18:08:29.743637 4358 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 27 18:08:29.743732 kubelet[4358]: I0527 18:08:29.743656 4358 status_manager.go:217] "Starting to sync pod status with apiserver"
May 27 18:08:29.743732 kubelet[4358]: I0527 18:08:29.743675 4358 kubelet.go:2321] "Starting kubelet main sync loop"
May 27 18:08:29.743732 kubelet[4358]: E0527 18:08:29.743716 4358 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 27 18:08:29.765367 kubelet[4358]: I0527 18:08:29.765340 4358 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 27 18:08:29.765367 kubelet[4358]: I0527 18:08:29.765357 4358 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 27 18:08:29.765367 kubelet[4358]: I0527 18:08:29.765374 4358 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:08:29.765526 kubelet[4358]: I0527 18:08:29.765512 4358 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 27 18:08:29.765556 kubelet[4358]: I0527 18:08:29.765523 4358 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 27 18:08:29.765556 kubelet[4358]: I0527 18:08:29.765547 4358 policy_none.go:49] "None policy: Start"
May 27 18:08:29.765984 kubelet[4358]: I0527 18:08:29.765971 4358 memory_manager.go:170] "Starting memorymanager" policy="None"
May 27 18:08:29.766005 kubelet[4358]: I0527 18:08:29.765995 4358 state_mem.go:35] "Initializing new in-memory state store"
May 27 18:08:29.766136 kubelet[4358]: I0527 18:08:29.766125 4358 state_mem.go:75] "Updated machine memory state"
May 27 18:08:29.769341 kubelet[4358]: I0527 18:08:29.769321 4358 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 27 18:08:29.769494 kubelet[4358]: I0527 18:08:29.769482 4358 eviction_manager.go:189] "Eviction manager: starting control loop"
May 27 18:08:29.769525 kubelet[4358]: I0527 18:08:29.769494 4358 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 27 18:08:29.769635 kubelet[4358]: I0527 18:08:29.769619 4358 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 27 18:08:29.847968 kubelet[4358]: W0527 18:08:29.847940 4358 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 27 18:08:29.848169 kubelet[4358]: W0527 18:08:29.848151 4358 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 27 18:08:29.848557 kubelet[4358]: W0527 18:08:29.848544 4358 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 27 18:08:29.848615 kubelet[4358]: E0527 18:08:29.848597 4358 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" already exists" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.872044 kubelet[4358]: I0527 18:08:29.872030 4358 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.875699 kubelet[4358]: I0527 18:08:29.875673 4358 kubelet_node_status.go:111] "Node was previously registered" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.875746 kubelet[4358]: I0527 18:08:29.875738 4358 kubelet_node_status.go:75] "Successfully registered node" node="ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937002 kubelet[4358]: I0527 18:08:29.936975 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937057 kubelet[4358]: I0527 18:08:29.937005 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937057 kubelet[4358]: I0527 18:08:29.937025 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3aa78242bb1927e3b91901392d5918d0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" (UID: \"3aa78242bb1927e3b91901392d5918d0\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937057 kubelet[4358]: I0527 18:08:29.937042 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937169 kubelet[4358]: I0527 18:08:29.937058 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/73db487bdeb2025626e402cab091d2c9-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-e5d745d36c\" (UID: \"73db487bdeb2025626e402cab091d2c9\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937169 kubelet[4358]: I0527 18:08:29.937078 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937169 kubelet[4358]: I0527 18:08:29.937097 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937169 kubelet[4358]: I0527 18:08:29.937157 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:29.937315 kubelet[4358]: I0527 18:08:29.937200 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/11d3ac03b7a3ab8910deac3e7e920238-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-e5d745d36c\" (UID: \"11d3ac03b7a3ab8910deac3e7e920238\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:30.732851 kubelet[4358]: I0527 18:08:30.732816 4358 apiserver.go:52] "Watching apiserver"
May 27 18:08:30.735969 kubelet[4358]: I0527 18:08:30.735948 4358 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
May 27 18:08:30.754960 kubelet[4358]: W0527 18:08:30.754932 4358 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 27 18:08:30.755033 kubelet[4358]: E0527 18:08:30.755003 4358 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4344.0.0-a-e5d745d36c\" already exists" pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c"
May 27 18:08:30.765519 kubelet[4358]: I0527 18:08:30.765475 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344.0.0-a-e5d745d36c" podStartSLOduration=1.765464944 podStartE2EDuration="1.765464944s" podCreationTimestamp="2025-05-27 18:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:08:30.765272205 +0000 UTC m=+1.091399127" watchObservedRunningTime="2025-05-27 18:08:30.765464944 +0000 UTC m=+1.091591866"
May 27 18:08:30.775190 kubelet[4358]: I0527 18:08:30.775148 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344.0.0-a-e5d745d36c" podStartSLOduration=1.775133035 podStartE2EDuration="1.775133035s" podCreationTimestamp="2025-05-27 18:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:08:30.770090688 +0000 UTC m=+1.096217610" watchObservedRunningTime="2025-05-27 18:08:30.775133035 +0000 UTC m=+1.101259957"
May 27 18:08:30.780362 kubelet[4358]: I0527 18:08:30.780314 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-e5d745d36c" podStartSLOduration=1.78030048 podStartE2EDuration="1.78030048s" podCreationTimestamp="2025-05-27 18:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:08:30.775086808 +0000 UTC m=+1.101213730" watchObservedRunningTime="2025-05-27 18:08:30.78030048 +0000 UTC m=+1.106427401"
May 27 18:08:34.989782 kubelet[4358]: I0527 18:08:34.989739 4358 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
May 27 18:08:34.990164 kubelet[4358]: I0527 18:08:34.990153 4358 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
May 27 18:08:34.990194 containerd[2775]: time="2025-05-27T18:08:34.990010203Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
May 27 18:08:35.174852 systemd[1]: Created slice kubepods-besteffort-pod7e606ec5_1638_4263_a113_ea84f51f2c07.slice - libcontainer container kubepods-besteffort-pod7e606ec5_1638_4263_a113_ea84f51f2c07.slice.
May 27 18:08:35.278930 kubelet[4358]: I0527 18:08:35.278845 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7e606ec5-1638-4263-a113-ea84f51f2c07-xtables-lock\") pod \"kube-proxy-bvlbf\" (UID: \"7e606ec5-1638-4263-a113-ea84f51f2c07\") " pod="kube-system/kube-proxy-bvlbf"
May 27 18:08:35.278930 kubelet[4358]: I0527 18:08:35.278901 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e606ec5-1638-4263-a113-ea84f51f2c07-lib-modules\") pod \"kube-proxy-bvlbf\" (UID: \"7e606ec5-1638-4263-a113-ea84f51f2c07\") " pod="kube-system/kube-proxy-bvlbf"
May 27 18:08:35.279048 kubelet[4358]: I0527 18:08:35.278935 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7e606ec5-1638-4263-a113-ea84f51f2c07-kube-proxy\") pod \"kube-proxy-bvlbf\" (UID: \"7e606ec5-1638-4263-a113-ea84f51f2c07\") " pod="kube-system/kube-proxy-bvlbf"
May 27 18:08:35.279048 kubelet[4358]: I0527 18:08:35.278963 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89fx\" (UniqueName: \"kubernetes.io/projected/7e606ec5-1638-4263-a113-ea84f51f2c07-kube-api-access-n89fx\") pod \"kube-proxy-bvlbf\" (UID: \"7e606ec5-1638-4263-a113-ea84f51f2c07\") " pod="kube-system/kube-proxy-bvlbf"
May 27 18:08:35.386030 kubelet[4358]: E0527 18:08:35.386005 4358 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
May 27 18:08:35.386068 kubelet[4358]: E0527 18:08:35.386031 4358 projected.go:194] Error preparing data for projected volume kube-api-access-n89fx for pod kube-system/kube-proxy-bvlbf: configmap "kube-root-ca.crt" not found
May 27 18:08:35.386090 kubelet[4358]: E0527 18:08:35.386079 4358 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e606ec5-1638-4263-a113-ea84f51f2c07-kube-api-access-n89fx podName:7e606ec5-1638-4263-a113-ea84f51f2c07 nodeName:}" failed. No retries permitted until 2025-05-27 18:08:35.886062278 +0000 UTC m=+6.212189200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-n89fx" (UniqueName: "kubernetes.io/projected/7e606ec5-1638-4263-a113-ea84f51f2c07-kube-api-access-n89fx") pod "kube-proxy-bvlbf" (UID: "7e606ec5-1638-4263-a113-ea84f51f2c07") : configmap "kube-root-ca.crt" not found
May 27 18:08:36.099008 containerd[2775]: time="2025-05-27T18:08:36.098967751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bvlbf,Uid:7e606ec5-1638-4263-a113-ea84f51f2c07,Namespace:kube-system,Attempt:0,}"
May 27 18:08:36.119702 containerd[2775]: time="2025-05-27T18:08:36.119663165Z" level=info msg="connecting to shim fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e" address="unix:///run/containerd/s/0d28fcf37591b699cb245ff9697716d8791878d545fe9adf218b404bc23e4adb" namespace=k8s.io protocol=ttrpc version=3
May 27 18:08:36.151721 systemd[1]: Started cri-containerd-fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e.scope - libcontainer container fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e.
May 27 18:08:36.153979 systemd[1]: Created slice kubepods-besteffort-pod0097f39f_5376_47ac_90bb_9121e212a336.slice - libcontainer container kubepods-besteffort-pod0097f39f_5376_47ac_90bb_9121e212a336.slice.
May 27 18:08:36.169625 containerd[2775]: time="2025-05-27T18:08:36.169592607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bvlbf,Uid:7e606ec5-1638-4263-a113-ea84f51f2c07,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e\"" May 27 18:08:36.171499 containerd[2775]: time="2025-05-27T18:08:36.171473281Z" level=info msg="CreateContainer within sandbox \"fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 18:08:36.178976 containerd[2775]: time="2025-05-27T18:08:36.178942406Z" level=info msg="Container d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:36.182774 containerd[2775]: time="2025-05-27T18:08:36.182740696Z" level=info msg="CreateContainer within sandbox \"fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7\"" May 27 18:08:36.183229 containerd[2775]: time="2025-05-27T18:08:36.183205897Z" level=info msg="StartContainer for \"d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7\"" May 27 18:08:36.184444 containerd[2775]: time="2025-05-27T18:08:36.184423911Z" level=info msg="connecting to shim d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7" address="unix:///run/containerd/s/0d28fcf37591b699cb245ff9697716d8791878d545fe9adf218b404bc23e4adb" protocol=ttrpc version=3 May 27 18:08:36.185336 kubelet[4358]: I0527 18:08:36.185310 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cglww\" (UniqueName: \"kubernetes.io/projected/0097f39f-5376-47ac-90bb-9121e212a336-kube-api-access-cglww\") pod \"tigera-operator-7c5755cdcb-449rd\" (UID: \"0097f39f-5376-47ac-90bb-9121e212a336\") " 
pod="tigera-operator/tigera-operator-7c5755cdcb-449rd" May 27 18:08:36.185552 kubelet[4358]: I0527 18:08:36.185355 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0097f39f-5376-47ac-90bb-9121e212a336-var-lib-calico\") pod \"tigera-operator-7c5755cdcb-449rd\" (UID: \"0097f39f-5376-47ac-90bb-9121e212a336\") " pod="tigera-operator/tigera-operator-7c5755cdcb-449rd" May 27 18:08:36.215751 systemd[1]: Started cri-containerd-d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7.scope - libcontainer container d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7. May 27 18:08:36.250001 containerd[2775]: time="2025-05-27T18:08:36.249972693Z" level=info msg="StartContainer for \"d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7\" returns successfully" May 27 18:08:36.456526 containerd[2775]: time="2025-05-27T18:08:36.456421126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-449rd,Uid:0097f39f-5376-47ac-90bb-9121e212a336,Namespace:tigera-operator,Attempt:0,}" May 27 18:08:36.464647 containerd[2775]: time="2025-05-27T18:08:36.464618877Z" level=info msg="connecting to shim bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928" address="unix:///run/containerd/s/f5dc7a383985ada67d582a82f910e7e188568b54d1eb883c1903a20f015d2e54" namespace=k8s.io protocol=ttrpc version=3 May 27 18:08:36.494777 systemd[1]: Started cri-containerd-bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928.scope - libcontainer container bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928. 
May 27 18:08:36.520481 containerd[2775]: time="2025-05-27T18:08:36.520455765Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-449rd,Uid:0097f39f-5376-47ac-90bb-9121e212a336,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928\"" May 27 18:08:36.521722 containerd[2775]: time="2025-05-27T18:08:36.521702685Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 18:08:36.768534 kubelet[4358]: I0527 18:08:36.768434 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bvlbf" podStartSLOduration=1.7684157630000001 podStartE2EDuration="1.768415763s" podCreationTimestamp="2025-05-27 18:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:08:36.768332486 +0000 UTC m=+7.094459407" watchObservedRunningTime="2025-05-27 18:08:36.768415763 +0000 UTC m=+7.094542685" May 27 18:08:38.933942 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3314394005.mount: Deactivated successfully. 
May 27 18:08:39.925327 containerd[2775]: time="2025-05-27T18:08:39.925238457Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:39.925327 containerd[2775]: time="2025-05-27T18:08:39.925237817Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=22143480" May 27 18:08:39.926029 containerd[2775]: time="2025-05-27T18:08:39.925978591Z" level=info msg="ImageCreate event name:\"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:39.927628 containerd[2775]: time="2025-05-27T18:08:39.927601301Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:39.928368 containerd[2775]: time="2025-05-27T18:08:39.928292194Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"22139475\" in 3.406563156s" May 27 18:08:39.928368 containerd[2775]: time="2025-05-27T18:08:39.928322354Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:171854d50ba608218142ad5d32c7dd12ce55d536f02872e56e7c04c1f0a96a6b\"" May 27 18:08:39.929958 containerd[2775]: time="2025-05-27T18:08:39.929872583Z" level=info msg="CreateContainer within sandbox \"bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 18:08:39.945555 containerd[2775]: time="2025-05-27T18:08:39.945526233Z" level=info msg="Container 
f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:39.948210 containerd[2775]: time="2025-05-27T18:08:39.948186602Z" level=info msg="CreateContainer within sandbox \"bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579\"" May 27 18:08:39.948744 containerd[2775]: time="2025-05-27T18:08:39.948697332Z" level=info msg="StartContainer for \"f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579\"" May 27 18:08:39.949444 containerd[2775]: time="2025-05-27T18:08:39.949422865Z" level=info msg="connecting to shim f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579" address="unix:///run/containerd/s/f5dc7a383985ada67d582a82f910e7e188568b54d1eb883c1903a20f015d2e54" protocol=ttrpc version=3 May 27 18:08:39.979758 systemd[1]: Started cri-containerd-f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579.scope - libcontainer container f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579. 
May 27 18:08:39.999151 containerd[2775]: time="2025-05-27T18:08:39.999119746Z" level=info msg="StartContainer for \"f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579\" returns successfully" May 27 18:08:40.785853 kubelet[4358]: I0527 18:08:40.785788 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7c5755cdcb-449rd" podStartSLOduration=1.378194892 podStartE2EDuration="4.785770475s" podCreationTimestamp="2025-05-27 18:08:36 +0000 UTC" firstStartedPulling="2025-05-27 18:08:36.521358942 +0000 UTC m=+6.847485824" lastFinishedPulling="2025-05-27 18:08:39.928934485 +0000 UTC m=+10.255061407" observedRunningTime="2025-05-27 18:08:40.785609152 +0000 UTC m=+11.111736074" watchObservedRunningTime="2025-05-27 18:08:40.785770475 +0000 UTC m=+11.111897397" May 27 18:08:44.886464 sudo[3097]: pam_unix(sudo:session): session closed for user root May 27 18:08:44.948519 sshd[3096]: Connection closed by 139.178.89.65 port 32854 May 27 18:08:44.948890 sshd-session[3094]: pam_unix(sshd:session): session closed for user core May 27 18:08:44.952214 systemd[1]: sshd@8-147.28.228.207:22-139.178.89.65:32854.service: Deactivated successfully. May 27 18:08:44.953894 systemd[1]: session-11.scope: Deactivated successfully. May 27 18:08:44.954103 systemd[1]: session-11.scope: Consumed 8.180s CPU time, 244.8M memory peak. May 27 18:08:44.955134 systemd-logind[2754]: Session 11 logged out. Waiting for processes to exit. May 27 18:08:44.955967 systemd-logind[2754]: Removed session 11. May 27 18:08:49.537205 systemd[1]: Created slice kubepods-besteffort-pod266c06c3_e6c2_4db4_aa78_fb241b8b5562.slice - libcontainer container kubepods-besteffort-pod266c06c3_e6c2_4db4_aa78_fb241b8b5562.slice. 
May 27 18:08:49.564464 kubelet[4358]: I0527 18:08:49.564419 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfhlf\" (UniqueName: \"kubernetes.io/projected/266c06c3-e6c2-4db4-aa78-fb241b8b5562-kube-api-access-nfhlf\") pod \"calico-typha-599544fd5f-xdg8j\" (UID: \"266c06c3-e6c2-4db4-aa78-fb241b8b5562\") " pod="calico-system/calico-typha-599544fd5f-xdg8j" May 27 18:08:49.564912 kubelet[4358]: I0527 18:08:49.564843 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/266c06c3-e6c2-4db4-aa78-fb241b8b5562-tigera-ca-bundle\") pod \"calico-typha-599544fd5f-xdg8j\" (UID: \"266c06c3-e6c2-4db4-aa78-fb241b8b5562\") " pod="calico-system/calico-typha-599544fd5f-xdg8j" May 27 18:08:49.564912 kubelet[4358]: I0527 18:08:49.564870 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/266c06c3-e6c2-4db4-aa78-fb241b8b5562-typha-certs\") pod \"calico-typha-599544fd5f-xdg8j\" (UID: \"266c06c3-e6c2-4db4-aa78-fb241b8b5562\") " pod="calico-system/calico-typha-599544fd5f-xdg8j" May 27 18:08:49.729783 systemd[1]: Created slice kubepods-besteffort-pod3b4d22f7_9bd2_450f_bfaa_30e9a1e0812f.slice - libcontainer container kubepods-besteffort-pod3b4d22f7_9bd2_450f_bfaa_30e9a1e0812f.slice. 
May 27 18:08:49.765702 kubelet[4358]: I0527 18:08:49.765662 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-cni-bin-dir\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.765890 kubelet[4358]: I0527 18:08:49.765738 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-lib-modules\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.765890 kubelet[4358]: I0527 18:08:49.765775 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-tigera-ca-bundle\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.765890 kubelet[4358]: I0527 18:08:49.765807 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-var-run-calico\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.765890 kubelet[4358]: I0527 18:08:49.765834 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-node-certs\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.765890 kubelet[4358]: I0527 18:08:49.765864 4358 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-var-lib-calico\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766026 kubelet[4358]: I0527 18:08:49.765890 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-cni-log-dir\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766026 kubelet[4358]: I0527 18:08:49.765917 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-cni-net-dir\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766026 kubelet[4358]: I0527 18:08:49.765948 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2lnf\" (UniqueName: \"kubernetes.io/projected/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-kube-api-access-f2lnf\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766026 kubelet[4358]: I0527 18:08:49.765982 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-flexvol-driver-host\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766026 kubelet[4358]: I0527 18:08:49.766003 4358 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-xtables-lock\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.766120 kubelet[4358]: I0527 18:08:49.766023 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f-policysync\") pod \"calico-node-jwzbk\" (UID: \"3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f\") " pod="calico-system/calico-node-jwzbk" May 27 18:08:49.841001 containerd[2775]: time="2025-05-27T18:08:49.840964796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599544fd5f-xdg8j,Uid:266c06c3-e6c2-4db4-aa78-fb241b8b5562,Namespace:calico-system,Attempt:0,}" May 27 18:08:49.849305 containerd[2775]: time="2025-05-27T18:08:49.849279046Z" level=info msg="connecting to shim 76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f" address="unix:///run/containerd/s/8575750898131bc081e6558e224845b0d34483facef0fbef3327e487805375f4" namespace=k8s.io protocol=ttrpc version=3 May 27 18:08:49.867843 kubelet[4358]: E0527 18:08:49.867821 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.867843 kubelet[4358]: W0527 18:08:49.867840 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.867938 kubelet[4358]: E0527 18:08:49.867860 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.871766 kubelet[4358]: E0527 18:08:49.871749 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.871792 kubelet[4358]: W0527 18:08:49.871766 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.871792 kubelet[4358]: E0527 18:08:49.871781 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.875419 kubelet[4358]: E0527 18:08:49.875407 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.875419 kubelet[4358]: W0527 18:08:49.875418 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.875482 kubelet[4358]: E0527 18:08:49.875429 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.886768 systemd[1]: Started cri-containerd-76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f.scope - libcontainer container 76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f. 
May 27 18:08:49.912250 containerd[2775]: time="2025-05-27T18:08:49.912221482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-599544fd5f-xdg8j,Uid:266c06c3-e6c2-4db4-aa78-fb241b8b5562,Namespace:calico-system,Attempt:0,} returns sandbox id \"76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f\"" May 27 18:08:49.913278 containerd[2775]: time="2025-05-27T18:08:49.913259334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 18:08:49.923669 kubelet[4358]: E0527 18:08:49.923638 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v6s4f" podUID="6d206e1a-532c-4375-9afc-8ff133d58b02" May 27 18:08:49.952982 kubelet[4358]: E0527 18:08:49.952964 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953032 kubelet[4358]: W0527 18:08:49.952982 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953032 kubelet[4358]: E0527 18:08:49.953000 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.953242 kubelet[4358]: E0527 18:08:49.953233 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953242 kubelet[4358]: W0527 18:08:49.953241 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953293 kubelet[4358]: E0527 18:08:49.953249 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.953419 kubelet[4358]: E0527 18:08:49.953410 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953443 kubelet[4358]: W0527 18:08:49.953418 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953443 kubelet[4358]: E0527 18:08:49.953426 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.953610 kubelet[4358]: E0527 18:08:49.953601 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953610 kubelet[4358]: W0527 18:08:49.953609 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953656 kubelet[4358]: E0527 18:08:49.953617 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.953760 kubelet[4358]: E0527 18:08:49.953751 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953780 kubelet[4358]: W0527 18:08:49.953759 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953780 kubelet[4358]: E0527 18:08:49.953768 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.953945 kubelet[4358]: E0527 18:08:49.953937 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.953967 kubelet[4358]: W0527 18:08:49.953945 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.953967 kubelet[4358]: E0527 18:08:49.953952 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.954104 kubelet[4358]: E0527 18:08:49.954096 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954126 kubelet[4358]: W0527 18:08:49.954103 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954126 kubelet[4358]: E0527 18:08:49.954111 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.954262 kubelet[4358]: E0527 18:08:49.954254 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954284 kubelet[4358]: W0527 18:08:49.954261 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954284 kubelet[4358]: E0527 18:08:49.954268 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.954430 kubelet[4358]: E0527 18:08:49.954422 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954456 kubelet[4358]: W0527 18:08:49.954429 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954456 kubelet[4358]: E0527 18:08:49.954437 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.954610 kubelet[4358]: E0527 18:08:49.954599 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954610 kubelet[4358]: W0527 18:08:49.954607 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954662 kubelet[4358]: E0527 18:08:49.954616 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.954742 kubelet[4358]: E0527 18:08:49.954733 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954742 kubelet[4358]: W0527 18:08:49.954741 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954821 kubelet[4358]: E0527 18:08:49.954748 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.954869 kubelet[4358]: E0527 18:08:49.954861 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.954889 kubelet[4358]: W0527 18:08:49.954868 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.954889 kubelet[4358]: E0527 18:08:49.954875 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.955234 kubelet[4358]: E0527 18:08:49.955221 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.955258 kubelet[4358]: W0527 18:08:49.955235 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.955258 kubelet[4358]: E0527 18:08:49.955250 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.955469 kubelet[4358]: E0527 18:08:49.955461 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.955491 kubelet[4358]: W0527 18:08:49.955469 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.955491 kubelet[4358]: E0527 18:08:49.955478 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.955642 kubelet[4358]: E0527 18:08:49.955634 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.955662 kubelet[4358]: W0527 18:08:49.955641 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.955662 kubelet[4358]: E0527 18:08:49.955650 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.955802 kubelet[4358]: E0527 18:08:49.955793 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.955826 kubelet[4358]: W0527 18:08:49.955802 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.955826 kubelet[4358]: E0527 18:08:49.955811 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.955956 kubelet[4358]: E0527 18:08:49.955947 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.955980 kubelet[4358]: W0527 18:08:49.955956 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.955980 kubelet[4358]: E0527 18:08:49.955964 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.956115 kubelet[4358]: E0527 18:08:49.956106 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.956137 kubelet[4358]: W0527 18:08:49.956114 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.956137 kubelet[4358]: E0527 18:08:49.956122 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.956304 kubelet[4358]: E0527 18:08:49.956295 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.956324 kubelet[4358]: W0527 18:08:49.956303 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.956324 kubelet[4358]: E0527 18:08:49.956311 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.956486 kubelet[4358]: E0527 18:08:49.956477 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.956506 kubelet[4358]: W0527 18:08:49.956487 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.956506 kubelet[4358]: E0527 18:08:49.956496 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.967825 kubelet[4358]: E0527 18:08:49.967802 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.967825 kubelet[4358]: W0527 18:08:49.967816 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.967932 kubelet[4358]: E0527 18:08:49.967829 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.967932 kubelet[4358]: I0527 18:08:49.967855 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zgm\" (UniqueName: \"kubernetes.io/projected/6d206e1a-532c-4375-9afc-8ff133d58b02-kube-api-access-s9zgm\") pod \"csi-node-driver-v6s4f\" (UID: \"6d206e1a-532c-4375-9afc-8ff133d58b02\") " pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:49.968009 kubelet[4358]: E0527 18:08:49.967990 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968009 kubelet[4358]: W0527 18:08:49.967999 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968086 kubelet[4358]: E0527 18:08:49.968011 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.968086 kubelet[4358]: I0527 18:08:49.968024 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d206e1a-532c-4375-9afc-8ff133d58b02-kubelet-dir\") pod \"csi-node-driver-v6s4f\" (UID: \"6d206e1a-532c-4375-9afc-8ff133d58b02\") " pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:49.968164 kubelet[4358]: E0527 18:08:49.968151 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968164 kubelet[4358]: W0527 18:08:49.968159 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968202 kubelet[4358]: E0527 18:08:49.968170 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.968202 kubelet[4358]: I0527 18:08:49.968183 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d206e1a-532c-4375-9afc-8ff133d58b02-socket-dir\") pod \"csi-node-driver-v6s4f\" (UID: \"6d206e1a-532c-4375-9afc-8ff133d58b02\") " pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:49.968315 kubelet[4358]: E0527 18:08:49.968302 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968315 kubelet[4358]: W0527 18:08:49.968311 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968358 kubelet[4358]: E0527 18:08:49.968322 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.968358 kubelet[4358]: I0527 18:08:49.968338 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6d206e1a-532c-4375-9afc-8ff133d58b02-varrun\") pod \"csi-node-driver-v6s4f\" (UID: \"6d206e1a-532c-4375-9afc-8ff133d58b02\") " pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:49.968533 kubelet[4358]: E0527 18:08:49.968521 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968533 kubelet[4358]: W0527 18:08:49.968530 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968571 kubelet[4358]: E0527 18:08:49.968542 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.968571 kubelet[4358]: I0527 18:08:49.968556 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d206e1a-532c-4375-9afc-8ff133d58b02-registration-dir\") pod \"csi-node-driver-v6s4f\" (UID: \"6d206e1a-532c-4375-9afc-8ff133d58b02\") " pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:49.968737 kubelet[4358]: E0527 18:08:49.968723 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968737 kubelet[4358]: W0527 18:08:49.968734 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968781 kubelet[4358]: E0527 18:08:49.968746 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.968869 kubelet[4358]: E0527 18:08:49.968859 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.968869 kubelet[4358]: W0527 18:08:49.968866 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.968914 kubelet[4358]: E0527 18:08:49.968877 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.969056 kubelet[4358]: E0527 18:08:49.969046 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969056 kubelet[4358]: W0527 18:08:49.969053 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969096 kubelet[4358]: E0527 18:08:49.969065 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.969198 kubelet[4358]: E0527 18:08:49.969188 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969198 kubelet[4358]: W0527 18:08:49.969196 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969239 kubelet[4358]: E0527 18:08:49.969207 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.969349 kubelet[4358]: E0527 18:08:49.969341 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969369 kubelet[4358]: W0527 18:08:49.969349 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969369 kubelet[4358]: E0527 18:08:49.969360 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.969503 kubelet[4358]: E0527 18:08:49.969495 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969524 kubelet[4358]: W0527 18:08:49.969502 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969545 kubelet[4358]: E0527 18:08:49.969520 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.969725 kubelet[4358]: E0527 18:08:49.969714 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969725 kubelet[4358]: W0527 18:08:49.969722 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969772 kubelet[4358]: E0527 18:08:49.969737 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.969861 kubelet[4358]: E0527 18:08:49.969851 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.969861 kubelet[4358]: W0527 18:08:49.969859 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.969904 kubelet[4358]: E0527 18:08:49.969869 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:49.970033 kubelet[4358]: E0527 18:08:49.970022 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.970033 kubelet[4358]: W0527 18:08:49.970030 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.970070 kubelet[4358]: E0527 18:08:49.970037 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:49.970217 kubelet[4358]: E0527 18:08:49.970209 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:49.970239 kubelet[4358]: W0527 18:08:49.970217 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:49.970239 kubelet[4358]: E0527 18:08:49.970226 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.032714 containerd[2775]: time="2025-05-27T18:08:50.032685800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jwzbk,Uid:3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f,Namespace:calico-system,Attempt:0,}" May 27 18:08:50.040860 containerd[2775]: time="2025-05-27T18:08:50.040834164Z" level=info msg="connecting to shim 42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92" address="unix:///run/containerd/s/628188dd7d6a85bd1928c5471c419e3fbfee3c4b0c8fe2dd0746d9f3b9b7f9bd" namespace=k8s.io protocol=ttrpc version=3 May 27 18:08:50.069904 kubelet[4358]: E0527 18:08:50.069886 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.069904 kubelet[4358]: W0527 18:08:50.069902 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.069989 kubelet[4358]: E0527 18:08:50.069919 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.070115 kubelet[4358]: E0527 18:08:50.070105 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.070141 kubelet[4358]: W0527 18:08:50.070114 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.070141 kubelet[4358]: E0527 18:08:50.070127 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.070342 kubelet[4358]: E0527 18:08:50.070334 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.070366 kubelet[4358]: W0527 18:08:50.070342 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.070388 kubelet[4358]: E0527 18:08:50.070354 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.070569 kubelet[4358]: E0527 18:08:50.070556 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.070569 kubelet[4358]: W0527 18:08:50.070564 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.070634 kubelet[4358]: E0527 18:08:50.070576 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.070741 kubelet[4358]: E0527 18:08:50.070728 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.070741 kubelet[4358]: W0527 18:08:50.070737 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.070787 kubelet[4358]: E0527 18:08:50.070751 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.070802 systemd[1]: Started cri-containerd-42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92.scope - libcontainer container 42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92. May 27 18:08:50.070949 kubelet[4358]: E0527 18:08:50.070936 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.070949 kubelet[4358]: W0527 18:08:50.070945 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.070996 kubelet[4358]: E0527 18:08:50.070958 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.071125 kubelet[4358]: E0527 18:08:50.071112 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071125 kubelet[4358]: W0527 18:08:50.071120 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071173 kubelet[4358]: E0527 18:08:50.071141 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.071294 kubelet[4358]: E0527 18:08:50.071284 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071294 kubelet[4358]: W0527 18:08:50.071293 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071337 kubelet[4358]: E0527 18:08:50.071309 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.071452 kubelet[4358]: E0527 18:08:50.071443 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071473 kubelet[4358]: W0527 18:08:50.071451 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071497 kubelet[4358]: E0527 18:08:50.071478 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.071625 kubelet[4358]: E0527 18:08:50.071617 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071655 kubelet[4358]: W0527 18:08:50.071625 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071655 kubelet[4358]: E0527 18:08:50.071644 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.071754 kubelet[4358]: E0527 18:08:50.071745 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071774 kubelet[4358]: W0527 18:08:50.071753 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071795 kubelet[4358]: E0527 18:08:50.071775 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.071902 kubelet[4358]: E0527 18:08:50.071894 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.071928 kubelet[4358]: W0527 18:08:50.071902 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.071928 kubelet[4358]: E0527 18:08:50.071913 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.072060 kubelet[4358]: E0527 18:08:50.072051 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.072060 kubelet[4358]: W0527 18:08:50.072059 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.072114 kubelet[4358]: E0527 18:08:50.072070 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.072288 kubelet[4358]: E0527 18:08:50.072275 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.072309 kubelet[4358]: W0527 18:08:50.072288 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.072309 kubelet[4358]: E0527 18:08:50.072305 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.072735 kubelet[4358]: E0527 18:08:50.072526 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.072735 kubelet[4358]: W0527 18:08:50.072535 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.072735 kubelet[4358]: E0527 18:08:50.072553 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.072735 kubelet[4358]: E0527 18:08:50.072724 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.072735 kubelet[4358]: W0527 18:08:50.072733 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073014 kubelet[4358]: E0527 18:08:50.072749 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.073014 kubelet[4358]: E0527 18:08:50.072904 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073014 kubelet[4358]: W0527 18:08:50.072911 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073014 kubelet[4358]: E0527 18:08:50.072926 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.073093 kubelet[4358]: E0527 18:08:50.073073 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073093 kubelet[4358]: W0527 18:08:50.073080 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073131 kubelet[4358]: E0527 18:08:50.073095 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.073274 kubelet[4358]: E0527 18:08:50.073263 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073274 kubelet[4358]: W0527 18:08:50.073271 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073315 kubelet[4358]: E0527 18:08:50.073286 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.073438 kubelet[4358]: E0527 18:08:50.073430 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073458 kubelet[4358]: W0527 18:08:50.073437 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073458 kubelet[4358]: E0527 18:08:50.073449 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.073649 kubelet[4358]: E0527 18:08:50.073640 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073673 kubelet[4358]: W0527 18:08:50.073649 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073673 kubelet[4358]: E0527 18:08:50.073660 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.073933 kubelet[4358]: E0527 18:08:50.073915 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.073933 kubelet[4358]: W0527 18:08:50.073931 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.073988 kubelet[4358]: E0527 18:08:50.073946 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.074158 kubelet[4358]: E0527 18:08:50.074146 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.074158 kubelet[4358]: W0527 18:08:50.074154 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.074200 kubelet[4358]: E0527 18:08:50.074167 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.074420 kubelet[4358]: E0527 18:08:50.074353 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.074420 kubelet[4358]: W0527 18:08:50.074363 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.074420 kubelet[4358]: E0527 18:08:50.074372 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.074673 kubelet[4358]: E0527 18:08:50.074657 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.074673 kubelet[4358]: W0527 18:08:50.074668 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.074730 kubelet[4358]: E0527 18:08:50.074681 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.082513 kubelet[4358]: E0527 18:08:50.082496 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.082513 kubelet[4358]: W0527 18:08:50.082509 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.082619 kubelet[4358]: E0527 18:08:50.082523 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.088474 containerd[2775]: time="2025-05-27T18:08:50.088440570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jwzbk,Uid:3b4d22f7-9bd2-450f-bfaa-30e9a1e0812f,Namespace:calico-system,Attempt:0,} returns sandbox id \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\"" May 27 18:08:50.590207 containerd[2775]: time="2025-05-27T18:08:50.590162215Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:50.590462 containerd[2775]: time="2025-05-27T18:08:50.590436618Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=33020269" May 27 18:08:50.590830 containerd[2775]: time="2025-05-27T18:08:50.590809582Z" level=info msg="ImageCreate event name:\"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:50.592398 containerd[2775]: time="2025-05-27T18:08:50.592375238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:50.593001 containerd[2775]: time="2025-05-27T18:08:50.592978004Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"33020123\" in 679.69115ms" May 27 18:08:50.593030 containerd[2775]: time="2025-05-27T18:08:50.593002004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:05ca98cdd7b8267a0dc5550048c0a195c8d42f85d92f090a669493485d8a6beb\"" 
May 27 18:08:50.593737 containerd[2775]: time="2025-05-27T18:08:50.593717132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 18:08:50.598132 containerd[2775]: time="2025-05-27T18:08:50.598100776Z" level=info msg="CreateContainer within sandbox \"76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 18:08:50.603902 containerd[2775]: time="2025-05-27T18:08:50.603857875Z" level=info msg="Container 96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:50.607264 containerd[2775]: time="2025-05-27T18:08:50.607237270Z" level=info msg="CreateContainer within sandbox \"76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540\"" May 27 18:08:50.607881 containerd[2775]: time="2025-05-27T18:08:50.607547673Z" level=info msg="StartContainer for \"96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540\"" May 27 18:08:50.608593 containerd[2775]: time="2025-05-27T18:08:50.608558243Z" level=info msg="connecting to shim 96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540" address="unix:///run/containerd/s/8575750898131bc081e6558e224845b0d34483facef0fbef3327e487805375f4" protocol=ttrpc version=3 May 27 18:08:50.631705 systemd[1]: Started cri-containerd-96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540.scope - libcontainer container 96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540. 
May 27 18:08:50.660091 containerd[2775]: time="2025-05-27T18:08:50.660065569Z" level=info msg="StartContainer for \"96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540\" returns successfully" May 27 18:08:50.788528 kubelet[4358]: I0527 18:08:50.788455 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-599544fd5f-xdg8j" podStartSLOduration=1.107961203 podStartE2EDuration="1.788441001s" podCreationTimestamp="2025-05-27 18:08:49 +0000 UTC" firstStartedPulling="2025-05-27 18:08:49.913071052 +0000 UTC m=+20.239197974" lastFinishedPulling="2025-05-27 18:08:50.59355085 +0000 UTC m=+20.919677772" observedRunningTime="2025-05-27 18:08:50.788175718 +0000 UTC m=+21.114302640" watchObservedRunningTime="2025-05-27 18:08:50.788441001 +0000 UTC m=+21.114567923" May 27 18:08:50.861660 kubelet[4358]: E0527 18:08:50.861591 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.861660 kubelet[4358]: W0527 18:08:50.861612 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.861660 kubelet[4358]: E0527 18:08:50.861629 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.861793 kubelet[4358]: E0527 18:08:50.861782 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.861793 kubelet[4358]: W0527 18:08:50.861789 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.861843 kubelet[4358]: E0527 18:08:50.861796 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.862107 kubelet[4358]: E0527 18:08:50.862095 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.862211 kubelet[4358]: W0527 18:08:50.862141 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.862233 kubelet[4358]: E0527 18:08:50.862222 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.862457 kubelet[4358]: E0527 18:08:50.862447 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.862478 kubelet[4358]: W0527 18:08:50.862456 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.862478 kubelet[4358]: E0527 18:08:50.862465 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.862631 kubelet[4358]: E0527 18:08:50.862622 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.862631 kubelet[4358]: W0527 18:08:50.862630 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.862690 kubelet[4358]: E0527 18:08:50.862638 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.862817 kubelet[4358]: E0527 18:08:50.862809 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.862817 kubelet[4358]: W0527 18:08:50.862815 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.862858 kubelet[4358]: E0527 18:08:50.862822 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.863021 kubelet[4358]: E0527 18:08:50.863013 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863041 kubelet[4358]: W0527 18:08:50.863020 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863041 kubelet[4358]: E0527 18:08:50.863027 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.863218 kubelet[4358]: E0527 18:08:50.863211 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863238 kubelet[4358]: W0527 18:08:50.863218 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863238 kubelet[4358]: E0527 18:08:50.863225 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.863434 kubelet[4358]: E0527 18:08:50.863426 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863454 kubelet[4358]: W0527 18:08:50.863434 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863454 kubelet[4358]: E0527 18:08:50.863441 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.863618 kubelet[4358]: E0527 18:08:50.863610 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863640 kubelet[4358]: W0527 18:08:50.863617 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863640 kubelet[4358]: E0527 18:08:50.863625 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.863750 kubelet[4358]: E0527 18:08:50.863742 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863770 kubelet[4358]: W0527 18:08:50.863749 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863770 kubelet[4358]: E0527 18:08:50.863757 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.863875 kubelet[4358]: E0527 18:08:50.863868 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.863898 kubelet[4358]: W0527 18:08:50.863875 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.863898 kubelet[4358]: E0527 18:08:50.863882 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.864005 kubelet[4358]: E0527 18:08:50.863997 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.864027 kubelet[4358]: W0527 18:08:50.864004 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.864027 kubelet[4358]: E0527 18:08:50.864013 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.864180 kubelet[4358]: E0527 18:08:50.864172 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.864200 kubelet[4358]: W0527 18:08:50.864180 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.864200 kubelet[4358]: E0527 18:08:50.864187 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.864350 kubelet[4358]: E0527 18:08:50.864342 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.864372 kubelet[4358]: W0527 18:08:50.864349 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.864372 kubelet[4358]: E0527 18:08:50.864356 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.875636 kubelet[4358]: E0527 18:08:50.875623 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.875665 kubelet[4358]: W0527 18:08:50.875637 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.875665 kubelet[4358]: E0527 18:08:50.875651 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.875828 kubelet[4358]: E0527 18:08:50.875816 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.875828 kubelet[4358]: W0527 18:08:50.875825 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.875872 kubelet[4358]: E0527 18:08:50.875837 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.875978 kubelet[4358]: E0527 18:08:50.875968 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.875978 kubelet[4358]: W0527 18:08:50.875975 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.876022 kubelet[4358]: E0527 18:08:50.875986 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.876198 kubelet[4358]: E0527 18:08:50.876187 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.876198 kubelet[4358]: W0527 18:08:50.876195 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.876240 kubelet[4358]: E0527 18:08:50.876206 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.876387 kubelet[4358]: E0527 18:08:50.876376 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.876387 kubelet[4358]: W0527 18:08:50.876384 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.876429 kubelet[4358]: E0527 18:08:50.876395 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.876645 kubelet[4358]: E0527 18:08:50.876625 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.876667 kubelet[4358]: W0527 18:08:50.876643 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.876667 kubelet[4358]: E0527 18:08:50.876659 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.876809 kubelet[4358]: E0527 18:08:50.876801 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.876834 kubelet[4358]: W0527 18:08:50.876809 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.876834 kubelet[4358]: E0527 18:08:50.876820 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.877022 kubelet[4358]: E0527 18:08:50.877014 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877043 kubelet[4358]: W0527 18:08:50.877022 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877043 kubelet[4358]: E0527 18:08:50.877033 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.877259 kubelet[4358]: E0527 18:08:50.877245 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877259 kubelet[4358]: W0527 18:08:50.877256 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877304 kubelet[4358]: E0527 18:08:50.877268 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.877446 kubelet[4358]: E0527 18:08:50.877435 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877446 kubelet[4358]: W0527 18:08:50.877443 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877485 kubelet[4358]: E0527 18:08:50.877453 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.877630 kubelet[4358]: E0527 18:08:50.877618 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877630 kubelet[4358]: W0527 18:08:50.877626 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877676 kubelet[4358]: E0527 18:08:50.877661 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.877749 kubelet[4358]: E0527 18:08:50.877738 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877749 kubelet[4358]: W0527 18:08:50.877745 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877790 kubelet[4358]: E0527 18:08:50.877761 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.877889 kubelet[4358]: E0527 18:08:50.877879 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.877889 kubelet[4358]: W0527 18:08:50.877887 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.877932 kubelet[4358]: E0527 18:08:50.877897 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.878143 kubelet[4358]: E0527 18:08:50.878129 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.878168 kubelet[4358]: W0527 18:08:50.878141 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.878168 kubelet[4358]: E0527 18:08:50.878154 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.878316 kubelet[4358]: E0527 18:08:50.878303 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.878316 kubelet[4358]: W0527 18:08:50.878313 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.878360 kubelet[4358]: E0527 18:08:50.878323 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.878615 kubelet[4358]: E0527 18:08:50.878597 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.878638 kubelet[4358]: W0527 18:08:50.878612 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.878638 kubelet[4358]: E0527 18:08:50.878629 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:50.878803 kubelet[4358]: E0527 18:08:50.878781 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.878803 kubelet[4358]: W0527 18:08:50.878800 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.878848 kubelet[4358]: E0527 18:08:50.878812 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:08:50.878968 kubelet[4358]: E0527 18:08:50.878959 4358 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:08:50.878989 kubelet[4358]: W0527 18:08:50.878968 4358 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:08:50.878989 kubelet[4358]: E0527 18:08:50.878976 4358 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:08:51.002156 containerd[2775]: time="2025-05-27T18:08:51.002118383Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:51.002474 containerd[2775]: time="2025-05-27T18:08:51.002176903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4264304" May 27 18:08:51.002786 containerd[2775]: time="2025-05-27T18:08:51.002767309Z" level=info msg="ImageCreate event name:\"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:51.004221 containerd[2775]: time="2025-05-27T18:08:51.004205283Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:51.004822 containerd[2775]: time="2025-05-27T18:08:51.004799769Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5633505\" in 411.056557ms" May 27 18:08:51.004843 containerd[2775]: time="2025-05-27T18:08:51.004828129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:080eaf4c238c85534b61055c31b109c96ce3d20075391e58988541a442c7c701\"" May 27 18:08:51.006196 containerd[2775]: time="2025-05-27T18:08:51.006176022Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 18:08:51.010255 containerd[2775]: time="2025-05-27T18:08:51.010228662Z" level=info msg="Container a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:51.014173 containerd[2775]: time="2025-05-27T18:08:51.014146860Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\"" May 27 18:08:51.014509 containerd[2775]: time="2025-05-27T18:08:51.014489423Z" level=info msg="StartContainer for \"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\"" May 27 18:08:51.015770 containerd[2775]: time="2025-05-27T18:08:51.015751195Z" level=info msg="connecting to shim a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f" address="unix:///run/containerd/s/628188dd7d6a85bd1928c5471c419e3fbfee3c4b0c8fe2dd0746d9f3b9b7f9bd" protocol=ttrpc version=3 May 27 18:08:51.046696 systemd[1]: Started cri-containerd-a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f.scope - libcontainer container a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f. May 27 18:08:51.075270 containerd[2775]: time="2025-05-27T18:08:51.075240053Z" level=info msg="StartContainer for \"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\" returns successfully" May 27 18:08:51.085006 systemd[1]: cri-containerd-a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f.scope: Deactivated successfully. 
May 27 18:08:51.086458 containerd[2775]: time="2025-05-27T18:08:51.086426762Z" level=info msg="received exit event container_id:\"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\" id:\"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\" pid:5383 exited_at:{seconds:1748369331 nanos:86171279}" May 27 18:08:51.086543 containerd[2775]: time="2025-05-27T18:08:51.086436522Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\" id:\"a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f\" pid:5383 exited_at:{seconds:1748369331 nanos:86171279}" May 27 18:08:51.102266 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f-rootfs.mount: Deactivated successfully. May 27 18:08:51.744773 kubelet[4358]: E0527 18:08:51.744730 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v6s4f" podUID="6d206e1a-532c-4375-9afc-8ff133d58b02" May 27 18:08:51.784919 containerd[2775]: time="2025-05-27T18:08:51.784887906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 18:08:53.599969 containerd[2775]: time="2025-05-27T18:08:53.599930559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:53.600315 containerd[2775]: time="2025-05-27T18:08:53.599974599Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=65748976" May 27 18:08:53.600559 containerd[2775]: time="2025-05-27T18:08:53.600541364Z" level=info msg="ImageCreate event name:\"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:53.602164 containerd[2775]: time="2025-05-27T18:08:53.602142298Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:53.602782 containerd[2775]: time="2025-05-27T18:08:53.602758984Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"67118217\" in 1.817839438s" May 27 18:08:53.602808 containerd[2775]: time="2025-05-27T18:08:53.602788864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:0a1b3d5412de2974bc057a3463a132f935c307bc06d5b990ad54031e1f5a351d\"" May 27 18:08:53.604312 containerd[2775]: time="2025-05-27T18:08:53.604293157Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 18:08:53.608625 containerd[2775]: time="2025-05-27T18:08:53.608601795Z" level=info msg="Container 4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:53.613051 containerd[2775]: time="2025-05-27T18:08:53.613018634Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\"" May 27 18:08:53.613432 containerd[2775]: time="2025-05-27T18:08:53.613404717Z" level=info msg="StartContainer for 
\"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\"" May 27 18:08:53.616665 containerd[2775]: time="2025-05-27T18:08:53.616630386Z" level=info msg="connecting to shim 4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d" address="unix:///run/containerd/s/628188dd7d6a85bd1928c5471c419e3fbfee3c4b0c8fe2dd0746d9f3b9b7f9bd" protocol=ttrpc version=3 May 27 18:08:53.637696 systemd[1]: Started cri-containerd-4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d.scope - libcontainer container 4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d. May 27 18:08:53.665439 containerd[2775]: time="2025-05-27T18:08:53.665408615Z" level=info msg="StartContainer for \"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\" returns successfully" May 27 18:08:53.744070 kubelet[4358]: E0527 18:08:53.744032 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-v6s4f" podUID="6d206e1a-532c-4375-9afc-8ff133d58b02" May 27 18:08:54.039044 systemd[1]: cri-containerd-4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d.scope: Deactivated successfully. May 27 18:08:54.039340 systemd[1]: cri-containerd-4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d.scope: Consumed 875ms CPU time, 197.1M memory peak, 165.5M written to disk. 
May 27 18:08:54.039845 containerd[2775]: time="2025-05-27T18:08:54.039818654Z" level=info msg="received exit event container_id:\"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\" id:\"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\" pid:5449 exited_at:{seconds:1748369334 nanos:39689533}" May 27 18:08:54.039948 containerd[2775]: time="2025-05-27T18:08:54.039923775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\" id:\"4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d\" pid:5449 exited_at:{seconds:1748369334 nanos:39689533}" May 27 18:08:54.051850 kubelet[4358]: I0527 18:08:54.051831 4358 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 27 18:08:54.055084 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d-rootfs.mount: Deactivated successfully. May 27 18:08:54.069505 systemd[1]: Created slice kubepods-burstable-pod1afbaf81_9802_4311_a5e3_ec291af87c54.slice - libcontainer container kubepods-burstable-pod1afbaf81_9802_4311_a5e3_ec291af87c54.slice. May 27 18:08:54.074801 systemd[1]: Created slice kubepods-burstable-podb3875208_89b2_4c38_9972_ddcdb9a3d15d.slice - libcontainer container kubepods-burstable-podb3875208_89b2_4c38_9972_ddcdb9a3d15d.slice. May 27 18:08:54.079870 systemd[1]: Created slice kubepods-besteffort-pod9d777eb3_dc96_46f0_ba14_4404408d1bf1.slice - libcontainer container kubepods-besteffort-pod9d777eb3_dc96_46f0_ba14_4404408d1bf1.slice. May 27 18:08:54.091989 systemd[1]: Created slice kubepods-besteffort-pod9ea28421_4e19_420d_bfa8_98dc649e6584.slice - libcontainer container kubepods-besteffort-pod9ea28421_4e19_420d_bfa8_98dc649e6584.slice. 
May 27 18:08:54.098591 kubelet[4358]: I0527 18:08:54.098485 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hbp\" (UniqueName: \"kubernetes.io/projected/1afbaf81-9802-4311-a5e3-ec291af87c54-kube-api-access-29hbp\") pod \"coredns-7c65d6cfc9-hzcq6\" (UID: \"1afbaf81-9802-4311-a5e3-ec291af87c54\") " pod="kube-system/coredns-7c65d6cfc9-hzcq6" May 27 18:08:54.098591 kubelet[4358]: I0527 18:08:54.098530 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56jl\" (UniqueName: \"kubernetes.io/projected/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-kube-api-access-r56jl\") pod \"whisker-84f8fc659b-7d7xn\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " pod="calico-system/whisker-84f8fc659b-7d7xn" May 27 18:08:54.098591 kubelet[4358]: I0527 18:08:54.098549 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb6xx\" (UniqueName: \"kubernetes.io/projected/b3875208-89b2-4c38-9972-ddcdb9a3d15d-kube-api-access-zb6xx\") pod \"coredns-7c65d6cfc9-bhcfn\" (UID: \"b3875208-89b2-4c38-9972-ddcdb9a3d15d\") " pod="kube-system/coredns-7c65d6cfc9-bhcfn" May 27 18:08:54.098591 kubelet[4358]: I0527 18:08:54.098566 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42660bd-427d-4774-a8fa-836372ea3d07-goldmane-ca-bundle\") pod \"goldmane-8f77d7b6c-qrwrj\" (UID: \"e42660bd-427d-4774-a8fa-836372ea3d07\") " pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.098591 kubelet[4358]: I0527 18:08:54.098588 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wx4\" (UniqueName: \"kubernetes.io/projected/e42660bd-427d-4774-a8fa-836372ea3d07-kube-api-access-x4wx4\") pod \"goldmane-8f77d7b6c-qrwrj\" (UID: 
\"e42660bd-427d-4774-a8fa-836372ea3d07\") " pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.100053 kubelet[4358]: I0527 18:08:54.098603 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1afbaf81-9802-4311-a5e3-ec291af87c54-config-volume\") pod \"coredns-7c65d6cfc9-hzcq6\" (UID: \"1afbaf81-9802-4311-a5e3-ec291af87c54\") " pod="kube-system/coredns-7c65d6cfc9-hzcq6" May 27 18:08:54.100053 kubelet[4358]: I0527 18:08:54.098621 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/65af4fe4-da6d-442f-800a-28a5b897384f-calico-apiserver-certs\") pod \"calico-apiserver-858f7796ff-w84v7\" (UID: \"65af4fe4-da6d-442f-800a-28a5b897384f\") " pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" May 27 18:08:54.100053 kubelet[4358]: I0527 18:08:54.098640 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9ea28421-4e19-420d-bfa8-98dc649e6584-calico-apiserver-certs\") pod \"calico-apiserver-858f7796ff-d7gxq\" (UID: \"9ea28421-4e19-420d-bfa8-98dc649e6584\") " pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" May 27 18:08:54.100053 kubelet[4358]: I0527 18:08:54.098657 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqrf\" (UniqueName: \"kubernetes.io/projected/9d777eb3-dc96-46f0-ba14-4404408d1bf1-kube-api-access-xcqrf\") pod \"calico-kube-controllers-6d6b48788b-blbsr\" (UID: \"9d777eb3-dc96-46f0-ba14-4404408d1bf1\") " pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" May 27 18:08:54.100053 kubelet[4358]: I0527 18:08:54.098674 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b3875208-89b2-4c38-9972-ddcdb9a3d15d-config-volume\") pod \"coredns-7c65d6cfc9-bhcfn\" (UID: \"b3875208-89b2-4c38-9972-ddcdb9a3d15d\") " pod="kube-system/coredns-7c65d6cfc9-bhcfn" May 27 18:08:54.107884 kubelet[4358]: I0527 18:08:54.098690 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx2v9\" (UniqueName: \"kubernetes.io/projected/9ea28421-4e19-420d-bfa8-98dc649e6584-kube-api-access-gx2v9\") pod \"calico-apiserver-858f7796ff-d7gxq\" (UID: \"9ea28421-4e19-420d-bfa8-98dc649e6584\") " pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" May 27 18:08:54.107884 kubelet[4358]: I0527 18:08:54.098707 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d777eb3-dc96-46f0-ba14-4404408d1bf1-tigera-ca-bundle\") pod \"calico-kube-controllers-6d6b48788b-blbsr\" (UID: \"9d777eb3-dc96-46f0-ba14-4404408d1bf1\") " pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" May 27 18:08:54.107884 kubelet[4358]: I0527 18:08:54.098722 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e42660bd-427d-4774-a8fa-836372ea3d07-goldmane-key-pair\") pod \"goldmane-8f77d7b6c-qrwrj\" (UID: \"e42660bd-427d-4774-a8fa-836372ea3d07\") " pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.107884 kubelet[4358]: I0527 18:08:54.098738 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gn4s\" (UniqueName: \"kubernetes.io/projected/65af4fe4-da6d-442f-800a-28a5b897384f-kube-api-access-7gn4s\") pod \"calico-apiserver-858f7796ff-w84v7\" (UID: \"65af4fe4-da6d-442f-800a-28a5b897384f\") " pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" May 27 18:08:54.107884 kubelet[4358]: I0527 18:08:54.098755 4358 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-backend-key-pair\") pod \"whisker-84f8fc659b-7d7xn\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " pod="calico-system/whisker-84f8fc659b-7d7xn" May 27 18:08:54.100222 systemd[1]: Created slice kubepods-besteffort-pode42660bd_427d_4774_a8fa_836372ea3d07.slice - libcontainer container kubepods-besteffort-pode42660bd_427d_4774_a8fa_836372ea3d07.slice. May 27 18:08:54.108166 kubelet[4358]: I0527 18:08:54.098781 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e42660bd-427d-4774-a8fa-836372ea3d07-config\") pod \"goldmane-8f77d7b6c-qrwrj\" (UID: \"e42660bd-427d-4774-a8fa-836372ea3d07\") " pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.108166 kubelet[4358]: I0527 18:08:54.098838 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-ca-bundle\") pod \"whisker-84f8fc659b-7d7xn\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " pod="calico-system/whisker-84f8fc659b-7d7xn" May 27 18:08:54.104120 systemd[1]: Created slice kubepods-besteffort-pod65af4fe4_da6d_442f_800a_28a5b897384f.slice - libcontainer container kubepods-besteffort-pod65af4fe4_da6d_442f_800a_28a5b897384f.slice. May 27 18:08:54.107819 systemd[1]: Created slice kubepods-besteffort-pod7e7254f9_cb88_4ff3_81e1_fe4eaea08651.slice - libcontainer container kubepods-besteffort-pod7e7254f9_cb88_4ff3_81e1_fe4eaea08651.slice. 
May 27 18:08:54.372662 containerd[2775]: time="2025-05-27T18:08:54.372630925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hzcq6,Uid:1afbaf81-9802-4311-a5e3-ec291af87c54,Namespace:kube-system,Attempt:0,}" May 27 18:08:54.378123 containerd[2775]: time="2025-05-27T18:08:54.378093571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bhcfn,Uid:b3875208-89b2-4c38-9972-ddcdb9a3d15d,Namespace:kube-system,Attempt:0,}" May 27 18:08:54.386634 containerd[2775]: time="2025-05-27T18:08:54.386605842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d6b48788b-blbsr,Uid:9d777eb3-dc96-46f0-ba14-4404408d1bf1,Namespace:calico-system,Attempt:0,}" May 27 18:08:54.397222 containerd[2775]: time="2025-05-27T18:08:54.397196051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-d7gxq,Uid:9ea28421-4e19-420d-bfa8-98dc649e6584,Namespace:calico-apiserver,Attempt:0,}" May 27 18:08:54.402869 containerd[2775]: time="2025-05-27T18:08:54.402841019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-qrwrj,Uid:e42660bd-427d-4774-a8fa-836372ea3d07,Namespace:calico-system,Attempt:0,}" May 27 18:08:54.406307 containerd[2775]: time="2025-05-27T18:08:54.406284608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-w84v7,Uid:65af4fe4-da6d-442f-800a-28a5b897384f,Namespace:calico-apiserver,Attempt:0,}" May 27 18:08:54.409871 containerd[2775]: time="2025-05-27T18:08:54.409842437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f8fc659b-7d7xn,Uid:7e7254f9-cb88-4ff3-81e1-fe4eaea08651,Namespace:calico-system,Attempt:0,}" May 27 18:08:54.430038 containerd[2775]: time="2025-05-27T18:08:54.429992526Z" level=error msg="Failed to destroy network for sandbox \"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430680 containerd[2775]: time="2025-05-27T18:08:54.430433050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bhcfn,Uid:b3875208-89b2-4c38-9972-ddcdb9a3d15d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430680 containerd[2775]: time="2025-05-27T18:08:54.430452170Z" level=error msg="Failed to destroy network for sandbox \"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430846 containerd[2775]: time="2025-05-27T18:08:54.430808453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hzcq6,Uid:1afbaf81-9802-4311-a5e3-ec291af87c54,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430887 kubelet[4358]: E0527 18:08:54.430674 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430887 kubelet[4358]: E0527 18:08:54.430783 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-bhcfn" May 27 18:08:54.430887 kubelet[4358]: E0527 18:08:54.430802 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-bhcfn" May 27 18:08:54.430982 kubelet[4358]: E0527 18:08:54.430843 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-bhcfn_kube-system(b3875208-89b2-4c38-9972-ddcdb9a3d15d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-bhcfn_kube-system(b3875208-89b2-4c38-9972-ddcdb9a3d15d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1d5605399a9192d57387a9d7ee9a5c0e49db48141de00fc3b6dab6f5fcbac45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-bhcfn" podUID="b3875208-89b2-4c38-9972-ddcdb9a3d15d" May 27 18:08:54.430982 kubelet[4358]: E0527 18:08:54.430934 4358 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.430982 kubelet[4358]: E0527 18:08:54.430975 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hzcq6" May 27 18:08:54.431061 kubelet[4358]: E0527 18:08:54.430992 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hzcq6" May 27 18:08:54.431061 kubelet[4358]: E0527 18:08:54.431030 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-hzcq6_kube-system(1afbaf81-9802-4311-a5e3-ec291af87c54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-hzcq6_kube-system(1afbaf81-9802-4311-a5e3-ec291af87c54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3bc2365ea4b5835bd496319852445c1282c8569e551327e4cb1b48959198cbf6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-hzcq6" podUID="1afbaf81-9802-4311-a5e3-ec291af87c54" May 27 18:08:54.431152 containerd[2775]: time="2025-05-27T18:08:54.431115336Z" level=error msg="Failed to destroy network for sandbox \"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.431478 containerd[2775]: time="2025-05-27T18:08:54.431450739Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d6b48788b-blbsr,Uid:9d777eb3-dc96-46f0-ba14-4404408d1bf1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.431623 kubelet[4358]: E0527 18:08:54.431596 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.431658 kubelet[4358]: E0527 18:08:54.431636 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" May 27 18:08:54.431688 kubelet[4358]: E0527 18:08:54.431655 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" May 27 18:08:54.431712 kubelet[4358]: E0527 18:08:54.431687 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6d6b48788b-blbsr_calico-system(9d777eb3-dc96-46f0-ba14-4404408d1bf1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6d6b48788b-blbsr_calico-system(9d777eb3-dc96-46f0-ba14-4404408d1bf1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a49ff7f8a4fe519cf3986f189f70238981e2208991f8e9f47e540070571f14e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" podUID="9d777eb3-dc96-46f0-ba14-4404408d1bf1" May 27 18:08:54.446972 containerd[2775]: time="2025-05-27T18:08:54.446930868Z" level=error msg="Failed to destroy network for sandbox \"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447406 containerd[2775]: time="2025-05-27T18:08:54.447371712Z" level=error msg="Failed to destroy network for sandbox 
\"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447472 containerd[2775]: time="2025-05-27T18:08:54.447370832Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-w84v7,Uid:65af4fe4-da6d-442f-800a-28a5b897384f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447665 kubelet[4358]: E0527 18:08:54.447629 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447710 kubelet[4358]: E0527 18:08:54.447688 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" May 27 18:08:54.447736 containerd[2775]: time="2025-05-27T18:08:54.447678755Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-858f7796ff-d7gxq,Uid:9ea28421-4e19-420d-bfa8-98dc649e6584,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447772 kubelet[4358]: E0527 18:08:54.447707 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" May 27 18:08:54.447772 kubelet[4358]: E0527 18:08:54.447754 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-858f7796ff-w84v7_calico-apiserver(65af4fe4-da6d-442f-800a-28a5b897384f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-858f7796ff-w84v7_calico-apiserver(65af4fe4-da6d-442f-800a-28a5b897384f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c37f676a0cab0a231ca570aab502c7d14931df659ab79457612c0f03da5a003\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" podUID="65af4fe4-da6d-442f-800a-28a5b897384f" May 27 18:08:54.447833 kubelet[4358]: E0527 18:08:54.447793 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.447856 kubelet[4358]: E0527 18:08:54.447832 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" May 27 18:08:54.447856 kubelet[4358]: E0527 18:08:54.447848 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" May 27 18:08:54.447900 kubelet[4358]: E0527 18:08:54.447879 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-858f7796ff-d7gxq_calico-apiserver(9ea28421-4e19-420d-bfa8-98dc649e6584)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-858f7796ff-d7gxq_calico-apiserver(9ea28421-4e19-420d-bfa8-98dc649e6584)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2cad9422aa5932f97d41abbf85cd067c97c31e4e578d22dce2e232003574ecd0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" podUID="9ea28421-4e19-420d-bfa8-98dc649e6584" May 27 18:08:54.447950 containerd[2775]: time="2025-05-27T18:08:54.447919917Z" level=error msg="Failed to destroy network for sandbox \"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.448246 containerd[2775]: time="2025-05-27T18:08:54.448222479Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-qrwrj,Uid:e42660bd-427d-4774-a8fa-836372ea3d07,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.448374 kubelet[4358]: E0527 18:08:54.448354 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.448401 kubelet[4358]: E0527 18:08:54.448387 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.448428 kubelet[4358]: E0527 18:08:54.448403 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-qrwrj" May 27 18:08:54.448449 kubelet[4358]: E0527 18:08:54.448433 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd5051a892a1d449b2a674bdb55777f0e59f300cf63e6f41741a947405bbaac5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:08:54.450329 containerd[2775]: time="2025-05-27T18:08:54.450307217Z" level=error msg="Failed to destroy network for sandbox \"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.450624 containerd[2775]: time="2025-05-27T18:08:54.450599699Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-84f8fc659b-7d7xn,Uid:7e7254f9-cb88-4ff3-81e1-fe4eaea08651,Namespace:calico-system,Attempt:0,} failed, error" error="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.450752 kubelet[4358]: E0527 18:08:54.450729 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:54.450786 kubelet[4358]: E0527 18:08:54.450766 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84f8fc659b-7d7xn" May 27 18:08:54.450808 kubelet[4358]: E0527 18:08:54.450783 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-84f8fc659b-7d7xn" May 27 18:08:54.450830 kubelet[4358]: E0527 18:08:54.450815 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-84f8fc659b-7d7xn_calico-system(7e7254f9-cb88-4ff3-81e1-fe4eaea08651)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-84f8fc659b-7d7xn_calico-system(7e7254f9-cb88-4ff3-81e1-fe4eaea08651)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"afb0ca361c155cdf6bd01c440ef35373464061a32cd3c1c1a915207e6a8093ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-84f8fc659b-7d7xn" podUID="7e7254f9-cb88-4ff3-81e1-fe4eaea08651" May 27 18:08:54.794664 containerd[2775]: time="2025-05-27T18:08:54.794551784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 18:08:55.749398 systemd[1]: Created slice kubepods-besteffort-pod6d206e1a_532c_4375_9afc_8ff133d58b02.slice - libcontainer container kubepods-besteffort-pod6d206e1a_532c_4375_9afc_8ff133d58b02.slice. May 27 18:08:55.760668 containerd[2775]: time="2025-05-27T18:08:55.760640712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v6s4f,Uid:6d206e1a-532c-4375-9afc-8ff133d58b02,Namespace:calico-system,Attempt:0,}" May 27 18:08:55.806997 containerd[2775]: time="2025-05-27T18:08:55.806959442Z" level=error msg="Failed to destroy network for sandbox \"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:55.807564 containerd[2775]: time="2025-05-27T18:08:55.807534207Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v6s4f,Uid:6d206e1a-532c-4375-9afc-8ff133d58b02,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:55.807704 kubelet[4358]: E0527 18:08:55.807676 4358 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:08:55.807890 kubelet[4358]: E0527 18:08:55.807726 4358 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:55.807890 kubelet[4358]: E0527 18:08:55.807743 4358 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-v6s4f" May 27 18:08:55.807890 kubelet[4358]: E0527 18:08:55.807780 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-v6s4f_calico-system(6d206e1a-532c-4375-9afc-8ff133d58b02)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-v6s4f_calico-system(6d206e1a-532c-4375-9afc-8ff133d58b02)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"622c3303a34f9e89d149955956bf49295c454101868a6d4145d6e3a6da745a57\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-v6s4f" podUID="6d206e1a-532c-4375-9afc-8ff133d58b02" May 27 18:08:55.808625 systemd[1]: run-netns-cni\x2d81070f4c\x2dbffe\x2da09b\x2d4536\x2d7ceb451d02d3.mount: Deactivated successfully. May 27 18:08:57.392675 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2373928431.mount: Deactivated successfully. May 27 18:08:57.415409 containerd[2775]: time="2025-05-27T18:08:57.415345092Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:57.415726 containerd[2775]: time="2025-05-27T18:08:57.415362092Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=150465379" May 27 18:08:57.416409 containerd[2775]: time="2025-05-27T18:08:57.415921056Z" level=info msg="ImageCreate event name:\"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:57.417317 containerd[2775]: time="2025-05-27T18:08:57.417255066Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:08:57.417959 containerd[2775]: time="2025-05-27T18:08:57.417902471Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"150465241\" in 
2.623319167s" May 27 18:08:57.417959 containerd[2775]: time="2025-05-27T18:08:57.417933471Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:f7148fde8e28b27da58f84cac134cdc53b5df321cda13c660192f06839670732\"" May 27 18:08:57.423879 containerd[2775]: time="2025-05-27T18:08:57.423853754Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 18:08:57.431982 containerd[2775]: time="2025-05-27T18:08:57.431954773Z" level=info msg="Container 79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8: CDI devices from CRI Config.CDIDevices: []" May 27 18:08:57.437278 containerd[2775]: time="2025-05-27T18:08:57.437239532Z" level=info msg="CreateContainer within sandbox \"42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\"" May 27 18:08:57.437643 containerd[2775]: time="2025-05-27T18:08:57.437587094Z" level=info msg="StartContainer for \"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\"" May 27 18:08:57.438962 containerd[2775]: time="2025-05-27T18:08:57.438918904Z" level=info msg="connecting to shim 79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8" address="unix:///run/containerd/s/628188dd7d6a85bd1928c5471c419e3fbfee3c4b0c8fe2dd0746d9f3b9b7f9bd" protocol=ttrpc version=3 May 27 18:08:57.465700 systemd[1]: Started cri-containerd-79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8.scope - libcontainer container 79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8. 
May 27 18:08:57.496175 containerd[2775]: time="2025-05-27T18:08:57.496144001Z" level=info msg="StartContainer for \"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" returns successfully" May 27 18:08:57.625848 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 18:08:57.625962 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 27 18:08:57.714690 kubelet[4358]: I0527 18:08:57.714597 4358 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-backend-key-pair\") pod \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " May 27 18:08:57.714690 kubelet[4358]: I0527 18:08:57.714635 4358 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-ca-bundle\") pod \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " May 27 18:08:57.714690 kubelet[4358]: I0527 18:08:57.714654 4358 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r56jl\" (UniqueName: \"kubernetes.io/projected/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-kube-api-access-r56jl\") pod \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\" (UID: \"7e7254f9-cb88-4ff3-81e1-fe4eaea08651\") " May 27 18:08:57.715031 kubelet[4358]: I0527 18:08:57.714987 4358 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "7e7254f9-cb88-4ff3-81e1-fe4eaea08651" (UID: "7e7254f9-cb88-4ff3-81e1-fe4eaea08651"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" May 27 18:08:57.716903 kubelet[4358]: I0527 18:08:57.716877 4358 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-kube-api-access-r56jl" (OuterVolumeSpecName: "kube-api-access-r56jl") pod "7e7254f9-cb88-4ff3-81e1-fe4eaea08651" (UID: "7e7254f9-cb88-4ff3-81e1-fe4eaea08651"). InnerVolumeSpecName "kube-api-access-r56jl". PluginName "kubernetes.io/projected", VolumeGidValue "" May 27 18:08:57.716955 kubelet[4358]: I0527 18:08:57.716929 4358 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "7e7254f9-cb88-4ff3-81e1-fe4eaea08651" (UID: "7e7254f9-cb88-4ff3-81e1-fe4eaea08651"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" May 27 18:08:57.749945 systemd[1]: Removed slice kubepods-besteffort-pod7e7254f9_cb88_4ff3_81e1_fe4eaea08651.slice - libcontainer container kubepods-besteffort-pod7e7254f9_cb88_4ff3_81e1_fe4eaea08651.slice. 
May 27 18:08:57.815160 kubelet[4358]: I0527 18:08:57.815128 4358 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-backend-key-pair\") on node \"ci-4344.0.0-a-e5d745d36c\" DevicePath \"\"" May 27 18:08:57.815160 kubelet[4358]: I0527 18:08:57.815154 4358 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-whisker-ca-bundle\") on node \"ci-4344.0.0-a-e5d745d36c\" DevicePath \"\"" May 27 18:08:57.815160 kubelet[4358]: I0527 18:08:57.815164 4358 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r56jl\" (UniqueName: \"kubernetes.io/projected/7e7254f9-cb88-4ff3-81e1-fe4eaea08651-kube-api-access-r56jl\") on node \"ci-4344.0.0-a-e5d745d36c\" DevicePath \"\"" May 27 18:08:57.823202 kubelet[4358]: I0527 18:08:57.823157 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jwzbk" podStartSLOduration=1.493998851 podStartE2EDuration="8.823142827s" podCreationTimestamp="2025-05-27 18:08:49 +0000 UTC" firstStartedPulling="2025-05-27 18:08:50.089323499 +0000 UTC m=+20.415450421" lastFinishedPulling="2025-05-27 18:08:57.418467515 +0000 UTC m=+27.744594397" observedRunningTime="2025-05-27 18:08:57.822931185 +0000 UTC m=+28.149058147" watchObservedRunningTime="2025-05-27 18:08:57.823142827 +0000 UTC m=+28.149269749" May 27 18:08:57.828541 systemd[1]: Created slice kubepods-besteffort-pod7b02d917_929c_457a_9f01_27a3d9745228.slice - libcontainer container kubepods-besteffort-pod7b02d917_929c_457a_9f01_27a3d9745228.slice. 
May 27 18:08:57.915721 kubelet[4358]: I0527 18:08:57.915688 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7b02d917-929c-457a-9f01-27a3d9745228-whisker-backend-key-pair\") pod \"whisker-5d46c699d6-jdjnd\" (UID: \"7b02d917-929c-457a-9f01-27a3d9745228\") " pod="calico-system/whisker-5d46c699d6-jdjnd" May 27 18:08:57.915785 kubelet[4358]: I0527 18:08:57.915727 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk6l\" (UniqueName: \"kubernetes.io/projected/7b02d917-929c-457a-9f01-27a3d9745228-kube-api-access-zwk6l\") pod \"whisker-5d46c699d6-jdjnd\" (UID: \"7b02d917-929c-457a-9f01-27a3d9745228\") " pod="calico-system/whisker-5d46c699d6-jdjnd" May 27 18:08:57.915785 kubelet[4358]: I0527 18:08:57.915748 4358 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b02d917-929c-457a-9f01-27a3d9745228-whisker-ca-bundle\") pod \"whisker-5d46c699d6-jdjnd\" (UID: \"7b02d917-929c-457a-9f01-27a3d9745228\") " pod="calico-system/whisker-5d46c699d6-jdjnd" May 27 18:08:58.131475 containerd[2775]: time="2025-05-27T18:08:58.131429594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d46c699d6-jdjnd,Uid:7b02d917-929c-457a-9f01-27a3d9745228,Namespace:calico-system,Attempt:0,}" May 27 18:08:58.234952 systemd-networkd[2686]: caliaec66ce95ee: Link UP May 27 18:08:58.235141 systemd-networkd[2686]: caliaec66ce95ee: Gained carrier May 27 18:08:58.243652 containerd[2775]: 2025-05-27 18:08:58.149 [INFO][6035] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:08:58.243652 containerd[2775]: 2025-05-27 18:08:58.164 [INFO][6035] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0 whisker-5d46c699d6- calico-system 7b02d917-929c-457a-9f01-27a3d9745228 837 0 2025-05-27 18:08:57 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5d46c699d6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c whisker-5d46c699d6-jdjnd eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliaec66ce95ee [] [] }} ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-" May 27 18:08:58.243652 containerd[2775]: 2025-05-27 18:08:58.164 [INFO][6035] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.243652 containerd[2775]: 2025-05-27 18:08:58.200 [INFO][6058] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" HandleID="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Workload="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.200 [INFO][6058] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" HandleID="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Workload="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400070e800), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4344.0.0-a-e5d745d36c", "pod":"whisker-5d46c699d6-jdjnd", "timestamp":"2025-05-27 18:08:58.200177793 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.200 [INFO][6058] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.200 [INFO][6058] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.200 [INFO][6058] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.208 [INFO][6058] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.212 [INFO][6058] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.214 [INFO][6058] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.216 [INFO][6058] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.243893 containerd[2775]: 2025-05-27 18:08:58.217 [INFO][6058] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.217 [INFO][6058] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 
handle="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.218 [INFO][6058] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3 May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.222 [INFO][6058] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.225 [INFO][6058] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.32.129/26] block=192.168.32.128/26 handle="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.226 [INFO][6058] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.129/26] handle="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" host="ci-4344.0.0-a-e5d745d36c" May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.226 [INFO][6058] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 18:08:58.244072 containerd[2775]: 2025-05-27 18:08:58.226 [INFO][6058] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.129/26] IPv6=[] ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" HandleID="k8s-pod-network.2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Workload="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.244192 containerd[2775]: 2025-05-27 18:08:58.228 [INFO][6035] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0", GenerateName:"whisker-5d46c699d6-", Namespace:"calico-system", SelfLink:"", UID:"7b02d917-929c-457a-9f01-27a3d9745228", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d46c699d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"whisker-5d46c699d6-jdjnd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.32.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"caliaec66ce95ee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:08:58.244192 containerd[2775]: 2025-05-27 18:08:58.228 [INFO][6035] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.129/32] ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.244255 containerd[2775]: 2025-05-27 18:08:58.228 [INFO][6035] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaec66ce95ee ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.244255 containerd[2775]: 2025-05-27 18:08:58.235 [INFO][6035] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.244337 containerd[2775]: 2025-05-27 18:08:58.236 [INFO][6035] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0", GenerateName:"whisker-5d46c699d6-", Namespace:"calico-system", SelfLink:"", 
UID:"7b02d917-929c-457a-9f01-27a3d9745228", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5d46c699d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3", Pod:"whisker-5d46c699d6-jdjnd", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.32.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaec66ce95ee", MAC:"0e:d3:48:96:ce:a8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:08:58.244387 containerd[2775]: 2025-05-27 18:08:58.242 [INFO][6035] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" Namespace="calico-system" Pod="whisker-5d46c699d6-jdjnd" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-whisker--5d46c699d6--jdjnd-eth0" May 27 18:08:58.253893 containerd[2775]: time="2025-05-27T18:08:58.253859448Z" level=info msg="connecting to shim 2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3" address="unix:///run/containerd/s/b542d33c9149e7dda6af2b238d26f7bff4eee37577c98e61d3ebe3db3edfd800" namespace=k8s.io protocol=ttrpc version=3 May 27 18:08:58.283702 systemd[1]: Started 
cri-containerd-2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3.scope - libcontainer container 2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3. May 27 18:08:58.310366 containerd[2775]: time="2025-05-27T18:08:58.310332642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d46c699d6-jdjnd,Uid:7b02d917-929c-457a-9f01-27a3d9745228,Namespace:calico-system,Attempt:0,} returns sandbox id \"2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3\"" May 27 18:08:58.311445 containerd[2775]: time="2025-05-27T18:08:58.311422609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:08:58.340482 containerd[2775]: time="2025-05-27T18:08:58.340443692Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:08:58.340715 containerd[2775]: time="2025-05-27T18:08:58.340683813Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:08:58.340747 containerd[2775]: time="2025-05-27T18:08:58.340726734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:08:58.340888 kubelet[4358]: E0527 18:08:58.340846 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:08:58.340954 kubelet[4358]: E0527 18:08:58.340904 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:08:58.341039 kubelet[4358]: E0527 18:08:58.341009 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:n
il,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:08:58.342589 containerd[2775]: time="2025-05-27T18:08:58.342566226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:08:58.365714 containerd[2775]: time="2025-05-27T18:08:58.365657427Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:08:58.367544 containerd[2775]: time="2025-05-27T18:08:58.367507200Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:08:58.367658 containerd[2775]: 
time="2025-05-27T18:08:58.367561881Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:08:58.367742 kubelet[4358]: E0527 18:08:58.367701 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:08:58.367781 kubelet[4358]: E0527 18:08:58.367753 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:08:58.367924 kubelet[4358]: E0527 18:08:58.367869 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:08:58.369068 kubelet[4358]: E0527 18:08:58.369038 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:08:58.396681 systemd[1]: var-lib-kubelet-pods-7e7254f9\x2dcb88\x2d4ff3\x2d81e1\x2dfe4eaea08651-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr56jl.mount: Deactivated successfully. May 27 18:08:58.396759 systemd[1]: var-lib-kubelet-pods-7e7254f9\x2dcb88\x2d4ff3\x2d81e1\x2dfe4eaea08651-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
May 27 18:08:58.803217 kubelet[4358]: I0527 18:08:58.803123 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:08:58.804098 kubelet[4358]: E0527 18:08:58.804062 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:08:59.313703 systemd-networkd[2686]: caliaec66ce95ee: Gained IPv6LL May 27 18:08:59.746711 kubelet[4358]: I0527 18:08:59.746680 4358 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7254f9-cb88-4ff3-81e1-fe4eaea08651" path="/var/lib/kubelet/pods/7e7254f9-cb88-4ff3-81e1-fe4eaea08651/volumes" May 27 18:08:59.804958 kubelet[4358]: E0527 18:08:59.804930 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:09:05.745654 containerd[2775]: time="2025-05-27T18:09:05.745571925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-qrwrj,Uid:e42660bd-427d-4774-a8fa-836372ea3d07,Namespace:calico-system,Attempt:0,}" May 27 18:09:05.745654 containerd[2775]: time="2025-05-27T18:09:05.745571605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-w84v7,Uid:65af4fe4-da6d-442f-800a-28a5b897384f,Namespace:calico-apiserver,Attempt:0,}" May 27 
18:09:05.841342 systemd-networkd[2686]: calia042b7606da: Link UP May 27 18:09:05.841550 systemd-networkd[2686]: calia042b7606da: Gained carrier May 27 18:09:05.850671 containerd[2775]: 2025-05-27 18:09:05.764 [INFO][6621] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:05.850671 containerd[2775]: 2025-05-27 18:09:05.778 [INFO][6621] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0 calico-apiserver-858f7796ff- calico-apiserver 65af4fe4-da6d-442f-800a-28a5b897384f 781 0 2025-05-27 18:08:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:858f7796ff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c calico-apiserver-858f7796ff-w84v7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia042b7606da [] [] }} ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-" May 27 18:09:05.850671 containerd[2775]: 2025-05-27 18:09:05.778 [INFO][6621] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.850671 containerd[2775]: 2025-05-27 18:09:05.798 [INFO][6671] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" 
HandleID="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.798 [INFO][6671] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" HandleID="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40007782b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"calico-apiserver-858f7796ff-w84v7", "timestamp":"2025-05-27 18:09:05.79817116 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.798 [INFO][6671] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.798 [INFO][6671] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.798 [INFO][6671] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.806 [INFO][6671] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.810 [INFO][6671] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.815 [INFO][6671] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.816 [INFO][6671] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.850869 containerd[2775]: 2025-05-27 18:09:05.818 [INFO][6671] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.818 [INFO][6671] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.819 [INFO][6671] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5 May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.834 [INFO][6671] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6671] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.130/26] block=192.168.32.128/26 handle="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6671] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.130/26] handle="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6671] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:05.851047 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6671] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.130/26] IPv6=[] ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" HandleID="k8s-pod-network.36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.851176 containerd[2775]: 2025-05-27 18:09:05.839 [INFO][6621] cni-plugin/k8s.go 418: Populated endpoint ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0", GenerateName:"calico-apiserver-858f7796ff-", Namespace:"calico-apiserver", SelfLink:"", UID:"65af4fe4-da6d-442f-800a-28a5b897384f", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"858f7796ff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"calico-apiserver-858f7796ff-w84v7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia042b7606da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:05.851222 containerd[2775]: 2025-05-27 18:09:05.840 [INFO][6621] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.130/32] ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.851222 containerd[2775]: 2025-05-27 18:09:05.840 [INFO][6621] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia042b7606da ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.851222 containerd[2775]: 2025-05-27 18:09:05.841 [INFO][6621] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" 
WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.851279 containerd[2775]: 2025-05-27 18:09:05.841 [INFO][6621] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0", GenerateName:"calico-apiserver-858f7796ff-", Namespace:"calico-apiserver", SelfLink:"", UID:"65af4fe4-da6d-442f-800a-28a5b897384f", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"858f7796ff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5", Pod:"calico-apiserver-858f7796ff-w84v7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia042b7606da", MAC:"aa:6b:5a:b2:4d:24", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:05.851324 containerd[2775]: 2025-05-27 18:09:05.848 [INFO][6621] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-w84v7" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--w84v7-eth0" May 27 18:09:05.873536 containerd[2775]: time="2025-05-27T18:09:05.873506794Z" level=info msg="connecting to shim 36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5" address="unix:///run/containerd/s/1b35c03b5856df313a02d3d16deace72a80d69b34fb13741f9f37a37783071f7" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:05.903754 systemd[1]: Started cri-containerd-36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5.scope - libcontainer container 36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5. 
May 27 18:09:05.929977 containerd[2775]: time="2025-05-27T18:09:05.929948969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-w84v7,Uid:65af4fe4-da6d-442f-800a-28a5b897384f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5\"" May 27 18:09:05.931010 containerd[2775]: time="2025-05-27T18:09:05.930976375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 18:09:05.932097 systemd-networkd[2686]: cali995577f62da: Link UP May 27 18:09:05.932302 systemd-networkd[2686]: cali995577f62da: Gained carrier May 27 18:09:05.939695 containerd[2775]: 2025-05-27 18:09:05.763 [INFO][6622] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:05.939695 containerd[2775]: 2025-05-27 18:09:05.779 [INFO][6622] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0 goldmane-8f77d7b6c- calico-system e42660bd-427d-4774-a8fa-836372ea3d07 777 0 2025-05-27 18:08:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:8f77d7b6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c goldmane-8f77d7b6c-qrwrj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali995577f62da [] [] }} ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-" May 27 18:09:05.939695 containerd[2775]: 2025-05-27 18:09:05.779 [INFO][6622] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" 
WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.939695 containerd[2775]: 2025-05-27 18:09:05.799 [INFO][6673] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" HandleID="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.799 [INFO][6673] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" HandleID="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40007887c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"goldmane-8f77d7b6c-qrwrj", "timestamp":"2025-05-27 18:09:05.799018924 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.799 [INFO][6673] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6673] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.838 [INFO][6673] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.907 [INFO][6673] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.911 [INFO][6673] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.917 [INFO][6673] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.918 [INFO][6673] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.939892 containerd[2775]: 2025-05-27 18:09:05.921 [INFO][6673] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.921 [INFO][6673] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.922 [INFO][6673] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.925 [INFO][6673] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.928 [INFO][6673] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.131/26] block=192.168.32.128/26 handle="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.928 [INFO][6673] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.131/26] handle="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.928 [INFO][6673] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:05.940081 containerd[2775]: 2025-05-27 18:09:05.928 [INFO][6673] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.131/26] IPv6=[] ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" HandleID="k8s-pod-network.4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.940257 containerd[2775]: 2025-05-27 18:09:05.930 [INFO][6622] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"e42660bd-427d-4774-a8fa-836372ea3d07", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"goldmane-8f77d7b6c-qrwrj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.32.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali995577f62da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:05.940257 containerd[2775]: 2025-05-27 18:09:05.930 [INFO][6622] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.131/32] ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.940324 containerd[2775]: 2025-05-27 18:09:05.930 [INFO][6622] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali995577f62da ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.940324 containerd[2775]: 2025-05-27 18:09:05.932 [INFO][6622] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.940361 containerd[2775]: 2025-05-27 18:09:05.932 [INFO][6622] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"e42660bd-427d-4774-a8fa-836372ea3d07", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a", Pod:"goldmane-8f77d7b6c-qrwrj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.32.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali995577f62da", MAC:"da:a3:55:cd:c7:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:05.940406 containerd[2775]: 2025-05-27 18:09:05.937 [INFO][6622] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" Namespace="calico-system" Pod="goldmane-8f77d7b6c-qrwrj" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-goldmane--8f77d7b6c--qrwrj-eth0" May 27 18:09:05.949306 containerd[2775]: time="2025-05-27T18:09:05.949280551Z" level=info msg="connecting to shim 4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a" address="unix:///run/containerd/s/16626df9760065b09d82cf600aaada8c598a4d03d51cd572a40dddad76eb6392" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:05.974692 systemd[1]: Started cri-containerd-4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a.scope - libcontainer container 4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a. May 27 18:09:06.002213 containerd[2775]: time="2025-05-27T18:09:06.002138787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-qrwrj,Uid:e42660bd-427d-4774-a8fa-836372ea3d07,Namespace:calico-system,Attempt:0,} returns sandbox id \"4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a\"" May 27 18:09:06.745366 containerd[2775]: time="2025-05-27T18:09:06.745324493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bhcfn,Uid:b3875208-89b2-4c38-9972-ddcdb9a3d15d,Namespace:kube-system,Attempt:0,}" May 27 18:09:06.817784 containerd[2775]: time="2025-05-27T18:09:06.817747978Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:06.818051 containerd[2775]: time="2025-05-27T18:09:06.817758778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=44453213" May 27 18:09:06.818465 containerd[2775]: time="2025-05-27T18:09:06.818441982Z" level=info msg="ImageCreate event name:\"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" May 27 18:09:06.820011 containerd[2775]: time="2025-05-27T18:09:06.819991229Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:06.820720 containerd[2775]: time="2025-05-27T18:09:06.820686233Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"45822470\" in 889.685338ms" May 27 18:09:06.820720 containerd[2775]: time="2025-05-27T18:09:06.820717513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:0d503660232383641bf9af3b7e4ef066c0e96a8ec586f123e5b56b6a196c983d\"" May 27 18:09:06.821428 containerd[2775]: time="2025-05-27T18:09:06.821411237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:09:06.822195 containerd[2775]: time="2025-05-27T18:09:06.822171880Z" level=info msg="CreateContainer within sandbox \"36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 18:09:06.826482 containerd[2775]: time="2025-05-27T18:09:06.826455342Z" level=info msg="Container cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:06.829651 containerd[2775]: time="2025-05-27T18:09:06.829626318Z" level=info msg="CreateContainer within sandbox \"36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2\"" May 27 
18:09:06.829975 containerd[2775]: time="2025-05-27T18:09:06.829957520Z" level=info msg="StartContainer for \"cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2\"" May 27 18:09:06.830889 containerd[2775]: time="2025-05-27T18:09:06.830867484Z" level=info msg="connecting to shim cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2" address="unix:///run/containerd/s/1b35c03b5856df313a02d3d16deace72a80d69b34fb13741f9f37a37783071f7" protocol=ttrpc version=3 May 27 18:09:06.835100 systemd-networkd[2686]: cali40a02d937b3: Link UP May 27 18:09:06.835328 systemd-networkd[2686]: cali40a02d937b3: Gained carrier May 27 18:09:06.843342 containerd[2775]: 2025-05-27 18:09:06.763 [INFO][6889] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:06.843342 containerd[2775]: 2025-05-27 18:09:06.774 [INFO][6889] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0 coredns-7c65d6cfc9- kube-system b3875208-89b2-4c38-9972-ddcdb9a3d15d 775 0 2025-05-27 18:08:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c coredns-7c65d6cfc9-bhcfn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali40a02d937b3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-" May 27 18:09:06.843342 containerd[2775]: 2025-05-27 18:09:06.774 [INFO][6889] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" 
WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843342 containerd[2775]: 2025-05-27 18:09:06.794 [INFO][6914] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" HandleID="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.794 [INFO][6914] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" HandleID="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001b7e70), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"coredns-7c65d6cfc9-bhcfn", "timestamp":"2025-05-27 18:09:06.794421861 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.794 [INFO][6914] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.794 [INFO][6914] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.794 [INFO][6914] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.804 [INFO][6914] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.819 [INFO][6914] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.822 [INFO][6914] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.824 [INFO][6914] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843598 containerd[2775]: 2025-05-27 18:09:06.825 [INFO][6914] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.825 [INFO][6914] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.826 [INFO][6914] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4 May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.828 [INFO][6914] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.832 [INFO][6914] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.132/26] block=192.168.32.128/26 handle="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.832 [INFO][6914] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.132/26] handle="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.832 [INFO][6914] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:06.843780 containerd[2775]: 2025-05-27 18:09:06.832 [INFO][6914] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.132/26] IPv6=[] ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" HandleID="k8s-pod-network.9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.833 [INFO][6889] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b3875208-89b2-4c38-9972-ddcdb9a3d15d", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"coredns-7c65d6cfc9-bhcfn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40a02d937b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.834 [INFO][6889] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.132/32] ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.834 [INFO][6889] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali40a02d937b3 ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.835 [INFO][6889] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.836 [INFO][6889] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b3875208-89b2-4c38-9972-ddcdb9a3d15d", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4", Pod:"coredns-7c65d6cfc9-bhcfn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali40a02d937b3", MAC:"06:f9:7e:70:79:e8", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:06.843907 containerd[2775]: 2025-05-27 18:09:06.841 [INFO][6889] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" Namespace="kube-system" Pod="coredns-7c65d6cfc9-bhcfn" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--bhcfn-eth0" May 27 18:09:06.845182 containerd[2775]: time="2025-05-27T18:09:06.845153116Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:06.845426 containerd[2775]: time="2025-05-27T18:09:06.845398757Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:06.845467 containerd[2775]: time="2025-05-27T18:09:06.845446638Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:09:06.845606 kubelet[4358]: E0527 
18:09:06.845552 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:06.845861 kubelet[4358]: E0527 18:09:06.845623 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:06.845861 kubelet[4358]: E0527 18:09:06.845770 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:06.846919 kubelet[4358]: E0527 18:09:06.846896 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:09:06.853218 containerd[2775]: 
time="2025-05-27T18:09:06.853192157Z" level=info msg="connecting to shim 9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4" address="unix:///run/containerd/s/de519726f5a3e2841261cee479cb2e4700326945edd810084ba21270e3dab402" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:06.860798 systemd[1]: Started cri-containerd-cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2.scope - libcontainer container cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2. May 27 18:09:06.867153 systemd[1]: Started cri-containerd-9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4.scope - libcontainer container 9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4. May 27 18:09:06.889022 containerd[2775]: time="2025-05-27T18:09:06.888997617Z" level=info msg="StartContainer for \"cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2\" returns successfully" May 27 18:09:06.891639 containerd[2775]: time="2025-05-27T18:09:06.891619070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-bhcfn,Uid:b3875208-89b2-4c38-9972-ddcdb9a3d15d,Namespace:kube-system,Attempt:0,} returns sandbox id \"9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4\"" May 27 18:09:06.893420 containerd[2775]: time="2025-05-27T18:09:06.893394919Z" level=info msg="CreateContainer within sandbox \"9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 18:09:06.900109 containerd[2775]: time="2025-05-27T18:09:06.900080833Z" level=info msg="Container 560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:06.902552 containerd[2775]: time="2025-05-27T18:09:06.902528005Z" level=info msg="CreateContainer within sandbox \"9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa\"" May 27 18:09:06.902899 containerd[2775]: time="2025-05-27T18:09:06.902878607Z" level=info msg="StartContainer for \"560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa\"" May 27 18:09:06.903644 containerd[2775]: time="2025-05-27T18:09:06.903624451Z" level=info msg="connecting to shim 560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa" address="unix:///run/containerd/s/de519726f5a3e2841261cee479cb2e4700326945edd810084ba21270e3dab402" protocol=ttrpc version=3 May 27 18:09:06.932711 systemd[1]: Started cri-containerd-560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa.scope - libcontainer container 560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa. May 27 18:09:06.953856 containerd[2775]: time="2025-05-27T18:09:06.953823384Z" level=info msg="StartContainer for \"560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa\" returns successfully" May 27 18:09:06.993697 systemd-networkd[2686]: cali995577f62da: Gained IPv6LL May 27 18:09:07.569697 systemd-networkd[2686]: calia042b7606da: Gained IPv6LL May 27 18:09:07.820571 kubelet[4358]: E0527 18:09:07.820485 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:09:07.826742 kubelet[4358]: I0527 18:09:07.826700 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-858f7796ff-w84v7" podStartSLOduration=20.936181734 podStartE2EDuration="21.826686916s" podCreationTimestamp="2025-05-27 18:08:46 +0000 UTC" firstStartedPulling="2025-05-27 18:09:05.930786254 +0000 UTC m=+36.256913176" lastFinishedPulling="2025-05-27 18:09:06.821291476 +0000 UTC m=+37.147418358" observedRunningTime="2025-05-27 
18:09:07.826271354 +0000 UTC m=+38.152398316" watchObservedRunningTime="2025-05-27 18:09:07.826686916 +0000 UTC m=+38.152813838" May 27 18:09:07.833574 kubelet[4358]: I0527 18:09:07.833532 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-bhcfn" podStartSLOduration=31.833520429 podStartE2EDuration="31.833520429s" podCreationTimestamp="2025-05-27 18:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:09:07.833340309 +0000 UTC m=+38.159467271" watchObservedRunningTime="2025-05-27 18:09:07.833520429 +0000 UTC m=+38.159647351" May 27 18:09:07.954675 systemd-networkd[2686]: cali40a02d937b3: Gained IPv6LL May 27 18:09:08.821741 kubelet[4358]: I0527 18:09:08.821708 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:09.745119 containerd[2775]: time="2025-05-27T18:09:09.745038594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d6b48788b-blbsr,Uid:9d777eb3-dc96-46f0-ba14-4404408d1bf1,Namespace:calico-system,Attempt:0,}" May 27 18:09:09.745427 containerd[2775]: time="2025-05-27T18:09:09.745118594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v6s4f,Uid:6d206e1a-532c-4375-9afc-8ff133d58b02,Namespace:calico-system,Attempt:0,}" May 27 18:09:09.745427 containerd[2775]: time="2025-05-27T18:09:09.745038394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-d7gxq,Uid:9ea28421-4e19-420d-bfa8-98dc649e6584,Namespace:calico-apiserver,Attempt:0,}" May 27 18:09:09.745427 containerd[2775]: time="2025-05-27T18:09:09.745039474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hzcq6,Uid:1afbaf81-9802-4311-a5e3-ec291af87c54,Namespace:kube-system,Attempt:0,}" May 27 18:09:09.821845 systemd-networkd[2686]: calic96dbb095e5: Link UP May 27 18:09:09.822058 
systemd-networkd[2686]: calic96dbb095e5: Gained carrier May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.763 [INFO][7295] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.774 [INFO][7295] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0 calico-kube-controllers-6d6b48788b- calico-system 9d777eb3-dc96-46f0-ba14-4404408d1bf1 779 0 2025-05-27 18:08:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6d6b48788b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c calico-kube-controllers-6d6b48788b-blbsr eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic96dbb095e5 [] [] }} ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.774 [INFO][7295] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.795 [INFO][7397] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" HandleID="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" 
Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.795 [INFO][7397] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" HandleID="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400045d8b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"calico-kube-controllers-6d6b48788b-blbsr", "timestamp":"2025-05-27 18:09:09.795097141 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.795 [INFO][7397] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.795 [INFO][7397] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.795 [INFO][7397] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.803 [INFO][7397] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.806 [INFO][7397] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.808 [INFO][7397] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.810 [INFO][7397] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.811 [INFO][7397] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.811 [INFO][7397] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.812 [INFO][7397] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.815 [INFO][7397] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.818 [INFO][7397] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.133/26] block=192.168.32.128/26 handle="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.818 [INFO][7397] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.133/26] handle="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.818 [INFO][7397] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:09.830403 containerd[2775]: 2025-05-27 18:09:09.819 [INFO][7397] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.133/26] IPv6=[] ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" HandleID="k8s-pod-network.fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.820 [INFO][7295] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0", GenerateName:"calico-kube-controllers-6d6b48788b-", Namespace:"calico-system", SelfLink:"", UID:"9d777eb3-dc96-46f0-ba14-4404408d1bf1", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d6b48788b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"calico-kube-controllers-6d6b48788b-blbsr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.32.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic96dbb095e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.820 [INFO][7295] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.133/32] ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.820 [INFO][7295] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic96dbb095e5 ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.822 [INFO][7295] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.822 [INFO][7295] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0", GenerateName:"calico-kube-controllers-6d6b48788b-", Namespace:"calico-system", SelfLink:"", UID:"9d777eb3-dc96-46f0-ba14-4404408d1bf1", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6d6b48788b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d", Pod:"calico-kube-controllers-6d6b48788b-blbsr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.32.133/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic96dbb095e5", MAC:"92:b3:c1:73:98:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:09.830886 containerd[2775]: 2025-05-27 18:09:09.829 [INFO][7295] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" Namespace="calico-system" Pod="calico-kube-controllers-6d6b48788b-blbsr" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--kube--controllers--6d6b48788b--blbsr-eth0" May 27 18:09:09.839895 containerd[2775]: time="2025-05-27T18:09:09.839862984Z" level=info msg="connecting to shim fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d" address="unix:///run/containerd/s/e6953ae0de1df71d01dedef8fea8a134868937d866961b94a6b34ed6898d90cf" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:09.869696 systemd[1]: Started cri-containerd-fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d.scope - libcontainer container fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d. 
May 27 18:09:09.895105 containerd[2775]: time="2025-05-27T18:09:09.895076114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6d6b48788b-blbsr,Uid:9d777eb3-dc96-46f0-ba14-4404408d1bf1,Namespace:calico-system,Attempt:0,} returns sandbox id \"fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d\"" May 27 18:09:09.896072 containerd[2775]: time="2025-05-27T18:09:09.896053719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 18:09:09.924550 systemd-networkd[2686]: cali51622486ff4: Link UP May 27 18:09:09.924762 systemd-networkd[2686]: cali51622486ff4: Gained carrier May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.763 [INFO][7301] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7301] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0 csi-node-driver- calico-system 6d206e1a-532c-4375-9afc-8ff133d58b02 643 0 2025-05-27 18:08:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:68bf44dd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c csi-node-driver-v6s4f eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali51622486ff4 [] [] }} ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7301] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.797 [INFO][7403] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" HandleID="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.797 [INFO][7403] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" HandleID="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000633830), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"csi-node-driver-v6s4f", "timestamp":"2025-05-27 18:09:09.797639752 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.797 [INFO][7403] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.819 [INFO][7403] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.819 [INFO][7403] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.903 [INFO][7403] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.907 [INFO][7403] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.911 [INFO][7403] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.912 [INFO][7403] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.914 [INFO][7403] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.914 [INFO][7403] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.915 [INFO][7403] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.917 [INFO][7403] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7403] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.134/26] block=192.168.32.128/26 handle="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7403] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.134/26] handle="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7403] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:09.933119 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7403] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.134/26] IPv6=[] ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" HandleID="k8s-pod-network.9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Workload="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933568 containerd[2775]: 2025-05-27 18:09:09.923 [INFO][7301] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6d206e1a-532c-4375-9afc-8ff133d58b02", ResourceVersion:"643", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"csi-node-driver-v6s4f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.32.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali51622486ff4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:09.933568 containerd[2775]: 2025-05-27 18:09:09.923 [INFO][7301] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.134/32] ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933568 containerd[2775]: 2025-05-27 18:09:09.923 [INFO][7301] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali51622486ff4 ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933568 containerd[2775]: 2025-05-27 18:09:09.924 [INFO][7301] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.933568 
containerd[2775]: 2025-05-27 18:09:09.925 [INFO][7301] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6d206e1a-532c-4375-9afc-8ff133d58b02", ResourceVersion:"643", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a", Pod:"csi-node-driver-v6s4f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.32.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali51622486ff4", MAC:"f6:7f:fd:f8:fe:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:09.933568 containerd[2775]: 2025-05-27 
18:09:09.931 [INFO][7301] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" Namespace="calico-system" Pod="csi-node-driver-v6s4f" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-csi--node--driver--v6s4f-eth0" May 27 18:09:09.950472 containerd[2775]: time="2025-05-27T18:09:09.950440086Z" level=info msg="connecting to shim 9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a" address="unix:///run/containerd/s/eb54bfcaa0ceb2d6321b9cfdc9afacb3717ecb8ba29c8b75e7d32e3cbbd9bf43" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:09.977694 systemd[1]: Started cri-containerd-9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a.scope - libcontainer container 9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a. May 27 18:09:09.995460 containerd[2775]: time="2025-05-27T18:09:09.995402690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-v6s4f,Uid:6d206e1a-532c-4375-9afc-8ff133d58b02,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a\"" May 27 18:09:10.024900 systemd-networkd[2686]: calie07ace8809a: Link UP May 27 18:09:10.025118 systemd-networkd[2686]: calie07ace8809a: Gained carrier May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.763 [INFO][7315] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7315] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0 calico-apiserver-858f7796ff- calico-apiserver 9ea28421-4e19-420d-bfa8-98dc649e6584 780 0 2025-05-27 18:08:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:858f7796ff projectcalico.org/namespace:calico-apiserver 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c calico-apiserver-858f7796ff-d7gxq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie07ace8809a [] [] }} ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7315] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.798 [INFO][7409] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" HandleID="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.798 [INFO][7409] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" HandleID="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003df2e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"calico-apiserver-858f7796ff-d7gxq", "timestamp":"2025-05-27 18:09:09.798276515 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.798 [INFO][7409] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7409] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:09.921 [INFO][7409] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.004 [INFO][7409] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.008 [INFO][7409] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.011 [INFO][7409] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.012 [INFO][7409] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.014 [INFO][7409] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.014 [INFO][7409] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.015 [INFO][7409] ipam/ipam.go 1764: Creating 
new handle: k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.018 [INFO][7409] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7409] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.32.135/26] block=192.168.32.128/26 handle="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7409] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.135/26] handle="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7409] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 18:09:10.033360 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7409] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.135/26] IPv6=[] ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" HandleID="k8s-pod-network.72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Workload="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.023 [INFO][7315] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0", GenerateName:"calico-apiserver-858f7796ff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ea28421-4e19-420d-bfa8-98dc649e6584", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"858f7796ff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"calico-apiserver-858f7796ff-d7gxq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.32.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie07ace8809a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.023 [INFO][7315] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.135/32] ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.023 [INFO][7315] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie07ace8809a ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.025 [INFO][7315] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.025 [INFO][7315] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0", GenerateName:"calico-apiserver-858f7796ff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ea28421-4e19-420d-bfa8-98dc649e6584", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"858f7796ff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad", Pod:"calico-apiserver-858f7796ff-d7gxq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.32.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie07ace8809a", MAC:"c6:7d:34:2d:94:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:10.033852 containerd[2775]: 2025-05-27 18:09:10.032 [INFO][7315] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" Namespace="calico-apiserver" Pod="calico-apiserver-858f7796ff-d7gxq" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-calico--apiserver--858f7796ff--d7gxq-eth0" May 27 18:09:10.043475 containerd[2775]: time="2025-05-27T18:09:10.043448421Z" level=info 
msg="connecting to shim 72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad" address="unix:///run/containerd/s/0695b80a9e4e5a02ea7f2d7ca8dbae6d2db3c57ac2eaa5b594437e6738807ec4" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:10.077711 systemd[1]: Started cri-containerd-72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad.scope - libcontainer container 72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad. May 27 18:09:10.104079 containerd[2775]: time="2025-05-27T18:09:10.104051087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-858f7796ff-d7gxq,Uid:9ea28421-4e19-420d-bfa8-98dc649e6584,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad\"" May 27 18:09:10.105885 containerd[2775]: time="2025-05-27T18:09:10.105861455Z" level=info msg="CreateContainer within sandbox \"72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 18:09:10.111421 containerd[2775]: time="2025-05-27T18:09:10.111387439Z" level=info msg="Container bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:10.114555 containerd[2775]: time="2025-05-27T18:09:10.114532573Z" level=info msg="CreateContainer within sandbox \"72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8\"" May 27 18:09:10.114914 containerd[2775]: time="2025-05-27T18:09:10.114892255Z" level=info msg="StartContainer for \"bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8\"" May 27 18:09:10.115843 containerd[2775]: time="2025-05-27T18:09:10.115819659Z" level=info msg="connecting to shim bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8" 
address="unix:///run/containerd/s/0695b80a9e4e5a02ea7f2d7ca8dbae6d2db3c57ac2eaa5b594437e6738807ec4" protocol=ttrpc version=3 May 27 18:09:10.126588 systemd-networkd[2686]: cali841f9ab28b4: Link UP May 27 18:09:10.126812 systemd-networkd[2686]: cali841f9ab28b4: Gained carrier May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.764 [INFO][7304] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7304] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0 coredns-7c65d6cfc9- kube-system 1afbaf81-9802-4311-a5e3-ec291af87c54 771 0 2025-05-27 18:08:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-e5d745d36c coredns-7c65d6cfc9-hzcq6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali841f9ab28b4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.778 [INFO][7304] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.798 [INFO][7411] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" 
HandleID="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.799 [INFO][7411] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" HandleID="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400045c4d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-e5d745d36c", "pod":"coredns-7c65d6cfc9-hzcq6", "timestamp":"2025-05-27 18:09:09.798928078 +0000 UTC"}, Hostname:"ci-4344.0.0-a-e5d745d36c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:09.799 [INFO][7411] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7411] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.022 [INFO][7411] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-e5d745d36c' May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.105 [INFO][7411] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.108 [INFO][7411] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.112 [INFO][7411] ipam/ipam.go 511: Trying affinity for 192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.113 [INFO][7411] ipam/ipam.go 158: Attempting to load block cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.115 [INFO][7411] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.32.128/26 host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.115 [INFO][7411] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.32.128/26 handle="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.116 [INFO][7411] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484 May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.119 [INFO][7411] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.32.128/26 handle="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.123 [INFO][7411] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.32.136/26] block=192.168.32.128/26 handle="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.123 [INFO][7411] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.32.136/26] handle="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" host="ci-4344.0.0-a-e5d745d36c" May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.123 [INFO][7411] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:09:10.134567 containerd[2775]: 2025-05-27 18:09:10.123 [INFO][7411] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.32.136/26] IPv6=[] ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" HandleID="k8s-pod-network.44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Workload="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.125 [INFO][7304] cni-plugin/k8s.go 418: Populated endpoint ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1afbaf81-9802-4311-a5e3-ec291af87c54", ResourceVersion:"771", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"", Pod:"coredns-7c65d6cfc9-hzcq6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali841f9ab28b4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.125 [INFO][7304] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.32.136/32] ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.125 [INFO][7304] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali841f9ab28b4 ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.126 [INFO][7304] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.127 [INFO][7304] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1afbaf81-9802-4311-a5e3-ec291af87c54", ResourceVersion:"771", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 8, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-e5d745d36c", ContainerID:"44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484", Pod:"coredns-7c65d6cfc9-hzcq6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.32.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali841f9ab28b4", MAC:"16:17:6f:57:b2:db", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:09:10.134995 containerd[2775]: 2025-05-27 18:09:10.132 [INFO][7304] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hzcq6" WorkloadEndpoint="ci--4344.0.0--a--e5d745d36c-k8s-coredns--7c65d6cfc9--hzcq6-eth0" May 27 18:09:10.144945 containerd[2775]: time="2025-05-27T18:09:10.144915307Z" level=info msg="connecting to shim 44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484" address="unix:///run/containerd/s/e647e38cf11ad854a7d63aa6cad7df3ed1b3618bfb8b0c821a7d6db04df2e9cd" namespace=k8s.io protocol=ttrpc version=3 May 27 18:09:10.147706 systemd[1]: Started cri-containerd-bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8.scope - libcontainer container bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8. May 27 18:09:10.158892 systemd[1]: Started cri-containerd-44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484.scope - libcontainer container 44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484. 
May 27 18:09:10.176536 containerd[2775]: time="2025-05-27T18:09:10.176503085Z" level=info msg="StartContainer for \"bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8\" returns successfully" May 27 18:09:10.184378 containerd[2775]: time="2025-05-27T18:09:10.184348800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hzcq6,Uid:1afbaf81-9802-4311-a5e3-ec291af87c54,Namespace:kube-system,Attempt:0,} returns sandbox id \"44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484\"" May 27 18:09:10.186196 containerd[2775]: time="2025-05-27T18:09:10.186169928Z" level=info msg="CreateContainer within sandbox \"44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 18:09:10.190457 containerd[2775]: time="2025-05-27T18:09:10.190429546Z" level=info msg="Container b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:10.193044 containerd[2775]: time="2025-05-27T18:09:10.193010918Z" level=info msg="CreateContainer within sandbox \"44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f\"" May 27 18:09:10.193386 containerd[2775]: time="2025-05-27T18:09:10.193366239Z" level=info msg="StartContainer for \"b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f\"" May 27 18:09:10.194621 containerd[2775]: time="2025-05-27T18:09:10.194596445Z" level=info msg="connecting to shim b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f" address="unix:///run/containerd/s/e647e38cf11ad854a7d63aa6cad7df3ed1b3618bfb8b0c821a7d6db04df2e9cd" protocol=ttrpc version=3 May 27 18:09:10.228766 systemd[1]: Started cri-containerd-b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f.scope - libcontainer container 
b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f. May 27 18:09:10.250793 containerd[2775]: time="2025-05-27T18:09:10.250722611Z" level=info msg="StartContainer for \"b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f\" returns successfully" May 27 18:09:10.727106 containerd[2775]: time="2025-05-27T18:09:10.727063222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=48045219" May 27 18:09:10.727106 containerd[2775]: time="2025-05-27T18:09:10.727066542Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:10.727767 containerd[2775]: time="2025-05-27T18:09:10.727742905Z" level=info msg="ImageCreate event name:\"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:10.729278 containerd[2775]: time="2025-05-27T18:09:10.729262512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:10.729885 containerd[2775]: time="2025-05-27T18:09:10.729862154Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"49414428\" in 833.782115ms" May 27 18:09:10.729920 containerd[2775]: time="2025-05-27T18:09:10.729891715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:4188fe2931435deda58a0dc1767a2f6ad2bb27e47662ccec626bd07006f56373\"" May 27 
18:09:10.730628 containerd[2775]: time="2025-05-27T18:09:10.730609998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 18:09:10.734980 containerd[2775]: time="2025-05-27T18:09:10.734959657Z" level=info msg="CreateContainer within sandbox \"fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 18:09:10.738569 containerd[2775]: time="2025-05-27T18:09:10.738548193Z" level=info msg="Container 7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:10.741834 containerd[2775]: time="2025-05-27T18:09:10.741812127Z" level=info msg="CreateContainer within sandbox \"fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\"" May 27 18:09:10.742086 containerd[2775]: time="2025-05-27T18:09:10.742067848Z" level=info msg="StartContainer for \"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\"" May 27 18:09:10.743016 containerd[2775]: time="2025-05-27T18:09:10.742996372Z" level=info msg="connecting to shim 7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016" address="unix:///run/containerd/s/e6953ae0de1df71d01dedef8fea8a134868937d866961b94a6b34ed6898d90cf" protocol=ttrpc version=3 May 27 18:09:10.779695 systemd[1]: Started cri-containerd-7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016.scope - libcontainer container 7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016. 
May 27 18:09:10.808439 containerd[2775]: time="2025-05-27T18:09:10.808409939Z" level=info msg="StartContainer for \"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" returns successfully" May 27 18:09:10.834089 kubelet[4358]: I0527 18:09:10.834044 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-hzcq6" podStartSLOduration=34.834028932 podStartE2EDuration="34.834028932s" podCreationTimestamp="2025-05-27 18:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:09:10.833521489 +0000 UTC m=+41.159648411" watchObservedRunningTime="2025-05-27 18:09:10.834028932 +0000 UTC m=+41.160155854" May 27 18:09:10.840928 kubelet[4358]: I0527 18:09:10.840888 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6d6b48788b-blbsr" podStartSLOduration=20.006283443 podStartE2EDuration="20.840872322s" podCreationTimestamp="2025-05-27 18:08:50 +0000 UTC" firstStartedPulling="2025-05-27 18:09:09.895871598 +0000 UTC m=+40.221998520" lastFinishedPulling="2025-05-27 18:09:10.730460477 +0000 UTC m=+41.056587399" observedRunningTime="2025-05-27 18:09:10.840722041 +0000 UTC m=+41.166848963" watchObservedRunningTime="2025-05-27 18:09:10.840872322 +0000 UTC m=+41.166999204" May 27 18:09:10.855838 kubelet[4358]: I0527 18:09:10.855791 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-858f7796ff-d7gxq" podStartSLOduration=24.855775387 podStartE2EDuration="24.855775387s" podCreationTimestamp="2025-05-27 18:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:09:10.855690467 +0000 UTC m=+41.181817389" watchObservedRunningTime="2025-05-27 18:09:10.855775387 +0000 UTC m=+41.181902309" May 27 
18:09:10.869460 containerd[2775]: time="2025-05-27T18:09:10.869422407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"8d721546a7a8bcf1db853a0b8ce525a71f53965b5ee6bbedb374fc3eaf80a924\" pid:7916 exit_status:1 exited_at:{seconds:1748369350 nanos:869063245}" May 27 18:09:10.897695 systemd-networkd[2686]: calic96dbb095e5: Gained IPv6LL May 27 18:09:11.119662 containerd[2775]: time="2025-05-27T18:09:11.119613169Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.119830 containerd[2775]: time="2025-05-27T18:09:11.119743969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8226240" May 27 18:09:11.120476 containerd[2775]: time="2025-05-27T18:09:11.120449092Z" level=info msg="ImageCreate event name:\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.122050 containerd[2775]: time="2025-05-27T18:09:11.122029779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.122672 containerd[2775]: time="2025-05-27T18:09:11.122594222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"9595481\" in 391.948184ms" May 27 18:09:11.122672 containerd[2775]: time="2025-05-27T18:09:11.122624462Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference 
\"sha256:ebe7e098653491dec9f15f87d7f5d33f47b09d1d6f3ef83deeaaa6237024c045\"" May 27 18:09:11.124229 containerd[2775]: time="2025-05-27T18:09:11.124206268Z" level=info msg="CreateContainer within sandbox \"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 18:09:11.129200 containerd[2775]: time="2025-05-27T18:09:11.129160809Z" level=info msg="Container 015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:11.133171 containerd[2775]: time="2025-05-27T18:09:11.133144786Z" level=info msg="CreateContainer within sandbox \"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236\"" May 27 18:09:11.133551 containerd[2775]: time="2025-05-27T18:09:11.133498588Z" level=info msg="StartContainer for \"015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236\"" May 27 18:09:11.134833 containerd[2775]: time="2025-05-27T18:09:11.134811834Z" level=info msg="connecting to shim 015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236" address="unix:///run/containerd/s/eb54bfcaa0ceb2d6321b9cfdc9afacb3717ecb8ba29c8b75e7d32e3cbbd9bf43" protocol=ttrpc version=3 May 27 18:09:11.169697 systemd[1]: Started cri-containerd-015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236.scope - libcontainer container 015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236. 
May 27 18:09:11.196940 containerd[2775]: time="2025-05-27T18:09:11.196905378Z" level=info msg="StartContainer for \"015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236\" returns successfully" May 27 18:09:11.197783 containerd[2775]: time="2025-05-27T18:09:11.197758861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 18:09:11.281720 systemd-networkd[2686]: calie07ace8809a: Gained IPv6LL May 27 18:09:11.409707 systemd-networkd[2686]: cali841f9ab28b4: Gained IPv6LL May 27 18:09:11.683534 containerd[2775]: time="2025-05-27T18:09:11.683434606Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.683534 containerd[2775]: time="2025-05-27T18:09:11.683498486Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=13749925" May 27 18:09:11.684166 containerd[2775]: time="2025-05-27T18:09:11.684140329Z" level=info msg="ImageCreate event name:\"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.685689 containerd[2775]: time="2025-05-27T18:09:11.685662776Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:09:11.686343 containerd[2775]: time="2025-05-27T18:09:11.686318538Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"15119118\" in 
488.528957ms" May 27 18:09:11.686400 containerd[2775]: time="2025-05-27T18:09:11.686343978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:a5d5f2a68204ed0dbc50f8778616ee92a63c0e342d178a4620e6271484e5c8b2\"" May 27 18:09:11.687970 containerd[2775]: time="2025-05-27T18:09:11.687946505Z" level=info msg="CreateContainer within sandbox \"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 18:09:11.692357 containerd[2775]: time="2025-05-27T18:09:11.692327004Z" level=info msg="Container ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3: CDI devices from CRI Config.CDIDevices: []" May 27 18:09:11.696770 containerd[2775]: time="2025-05-27T18:09:11.696741143Z" level=info msg="CreateContainer within sandbox \"9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3\"" May 27 18:09:11.697077 containerd[2775]: time="2025-05-27T18:09:11.697055104Z" level=info msg="StartContainer for \"ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3\"" May 27 18:09:11.698318 containerd[2775]: time="2025-05-27T18:09:11.698293869Z" level=info msg="connecting to shim ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3" address="unix:///run/containerd/s/eb54bfcaa0ceb2d6321b9cfdc9afacb3717ecb8ba29c8b75e7d32e3cbbd9bf43" protocol=ttrpc version=3 May 27 18:09:11.726696 systemd[1]: Started cri-containerd-ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3.scope - libcontainer container ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3. 
May 27 18:09:11.730669 systemd-networkd[2686]: cali51622486ff4: Gained IPv6LL May 27 18:09:11.753538 containerd[2775]: time="2025-05-27T18:09:11.753511344Z" level=info msg="StartContainer for \"ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3\" returns successfully" May 27 18:09:11.790228 kubelet[4358]: I0527 18:09:11.790207 4358 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 18:09:11.790296 kubelet[4358]: I0527 18:09:11.790235 4358 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 18:09:11.832795 kubelet[4358]: I0527 18:09:11.832771 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:11.851066 kubelet[4358]: I0527 18:09:11.851024 4358 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-v6s4f" podStartSLOduration=21.160393032 podStartE2EDuration="22.851010799s" podCreationTimestamp="2025-05-27 18:08:49 +0000 UTC" firstStartedPulling="2025-05-27 18:09:09.996281694 +0000 UTC m=+40.322408616" lastFinishedPulling="2025-05-27 18:09:11.686899461 +0000 UTC m=+42.013026383" observedRunningTime="2025-05-27 18:09:11.850750317 +0000 UTC m=+42.176877239" watchObservedRunningTime="2025-05-27 18:09:11.851010799 +0000 UTC m=+42.177137721" May 27 18:09:11.864644 containerd[2775]: time="2025-05-27T18:09:11.864613456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"a3a909df90e1cd7e11515dfb033f7a9fcef4c72146cfe7ff701e1013384f8e53\" pid:8118 exited_at:{seconds:1748369351 nanos:864417176}" May 27 18:09:12.669412 kubelet[4358]: I0527 18:09:12.669369 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:12.739589 
containerd[2775]: time="2025-05-27T18:09:12.739550400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"7367a752957ffa5c563504590b4b0e7e157ce55d779a278be44b9021638d628a\" pid:8195 exit_status:1 exited_at:{seconds:1748369352 nanos:739324600}" May 27 18:09:12.808802 containerd[2775]: time="2025-05-27T18:09:12.808765526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"2247a54d46987d2262a31f800f5bb2bd162756fe89f82284efdfa0da78056b6a\" pid:8230 exit_status:1 exited_at:{seconds:1748369352 nanos:808593645}" May 27 18:09:13.150400 kubelet[4358]: I0527 18:09:13.150364 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:13.745818 containerd[2775]: time="2025-05-27T18:09:13.745782498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:09:13.795556 containerd[2775]: time="2025-05-27T18:09:13.795523057Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:13.795759 containerd[2775]: time="2025-05-27T18:09:13.795731578Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:09:13.795810 containerd[2775]: time="2025-05-27T18:09:13.795731938Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:13.795928 kubelet[4358]: E0527 18:09:13.795888 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:09:13.795999 kubelet[4358]: E0527 18:09:13.795934 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:09:13.796053 kubelet[4358]: E0527 18:09:13.796021 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:13.797671 containerd[2775]: 
time="2025-05-27T18:09:13.797649385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:09:13.822131 containerd[2775]: time="2025-05-27T18:09:13.822090363Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:13.822361 containerd[2775]: time="2025-05-27T18:09:13.822335244Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:13.822416 containerd[2775]: time="2025-05-27T18:09:13.822350084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:09:13.822492 kubelet[4358]: E0527 18:09:13.822465 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:09:13.822529 kubelet[4358]: E0527 18:09:13.822496 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:09:13.822606 kubelet[4358]: E0527 18:09:13.822565 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:13.823743 kubelet[4358]: E0527 18:09:13.823711 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:09:14.426839 systemd-networkd[2686]: vxlan.calico: Link UP May 27 
18:09:14.426843 systemd-networkd[2686]: vxlan.calico: Gained carrier May 27 18:09:16.081683 systemd-networkd[2686]: vxlan.calico: Gained IPv6LL May 27 18:09:18.745344 containerd[2775]: time="2025-05-27T18:09:18.745311016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:09:18.829097 containerd[2775]: time="2025-05-27T18:09:18.829063069Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:18.829297 containerd[2775]: time="2025-05-27T18:09:18.829274549Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:18.829361 containerd[2775]: time="2025-05-27T18:09:18.829329590Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:09:18.829443 kubelet[4358]: E0527 18:09:18.829407 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:18.829708 kubelet[4358]: E0527 18:09:18.829453 4358 kuberuntime_image.go:55] 
"Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:18.829708 kubelet[4358]: E0527 18:09:18.829550 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessP
robe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:18.830724 kubelet[4358]: E0527 18:09:18.830697 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to 
authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:09:26.746189 kubelet[4358]: E0527 18:09:26.746138 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:09:29.745756 kubelet[4358]: E0527 18:09:29.745721 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:09:29.856900 kubelet[4358]: I0527 18:09:29.856868 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:32.154890 containerd[2775]: time="2025-05-27T18:09:32.154850013Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"aded1812cc72e964882e4915eb2af8408b62341e3ae29f0a9a6e1a9c438ac9a9\" pid:8672 exited_at:{seconds:1748369372 nanos:154674813}" May 27 18:09:34.308638 containerd[2775]: time="2025-05-27T18:09:34.308600085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"0b03ef5f1f8d773d18a4cd9b1afad1d61b3fd668db005d0861150e29f652dbaf\" pid:8694 exited_at:{seconds:1748369374 nanos:308436884}" May 27 
18:09:38.744762 containerd[2775]: time="2025-05-27T18:09:38.744711427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:09:38.770367 containerd[2775]: time="2025-05-27T18:09:38.770329732Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:38.770611 containerd[2775]: time="2025-05-27T18:09:38.770577693Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:38.770681 containerd[2775]: time="2025-05-27T18:09:38.770593053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:09:38.770771 kubelet[4358]: E0527 18:09:38.770735 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:09:38.771011 kubelet[4358]: E0527 18:09:38.770778 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:09:38.771011 kubelet[4358]: E0527 18:09:38.770861 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:38.772493 containerd[2775]: time="2025-05-27T18:09:38.772472738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:09:38.798112 containerd[2775]: time="2025-05-27T18:09:38.798069043Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:38.798335 containerd[2775]: time="2025-05-27T18:09:38.798307684Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:38.798379 containerd[2775]: time="2025-05-27T18:09:38.798336044Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:09:38.798482 kubelet[4358]: E0527 18:09:38.798444 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:09:38.798527 kubelet[4358]: E0527 18:09:38.798490 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:09:38.798665 kubelet[4358]: E0527 18:09:38.798635 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:38.799804 kubelet[4358]: E0527 18:09:38.799774 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:09:42.744260 containerd[2775]: time="2025-05-27T18:09:42.744218363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"e2f24e5dabd37d5bad14eeb5d2c4fe92f84ec9faefce795533b5201da5aa75b9\" pid:8747 exited_at:{seconds:1748369382 nanos:744000841}" May 27 18:09:44.745008 containerd[2775]: time="2025-05-27T18:09:44.744972040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:09:44.776461 containerd[2775]: time="2025-05-27T18:09:44.776418258Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:09:44.776699 containerd[2775]: time="2025-05-27T18:09:44.776665460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:09:44.776766 containerd[2775]: time="2025-05-27T18:09:44.776717380Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:09:44.776845 kubelet[4358]: E0527 18:09:44.776809 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:44.777079 kubelet[4358]: E0527 18:09:44.776853 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:09:44.777079 kubelet[4358]: E0527 18:09:44.776982 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:09:44.778152 kubelet[4358]: E0527 18:09:44.778123 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:09:49.795649 kubelet[4358]: I0527 
18:09:49.795613 4358 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:09:53.744955 kubelet[4358]: E0527 18:09:53.744891 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:09:55.745737 kubelet[4358]: E0527 18:09:55.745686 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:10:04.307574 containerd[2775]: time="2025-05-27T18:10:04.307528972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"9d158bb3302b2951f32c49d65565b300ffdc0f11d9c1a38ff00a86754b3ba95c\" pid:8816 exited_at:{seconds:1748369404 nanos:307360211}" May 27 18:10:06.745031 kubelet[4358]: E0527 18:10:06.744970 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:10:07.745359 kubelet[4358]: E0527 18:10:07.745307 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:10:12.744115 containerd[2775]: time="2025-05-27T18:10:12.744057211Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"84761a9fae59f9254173ac84877039285d2079f86c6ab0a5cb6f182302836865\" pid:8840 exited_at:{seconds:1748369412 nanos:743778609}" May 27 18:10:17.745212 kubelet[4358]: E0527 18:10:17.745162 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:10:18.745873 kubelet[4358]: E0527 18:10:18.745814 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:10:31.745075 containerd[2775]: time="2025-05-27T18:10:31.745021519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:10:31.847136 containerd[2775]: time="2025-05-27T18:10:31.847098097Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:10:31.847371 containerd[2775]: time="2025-05-27T18:10:31.847345738Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:10:31.847436 containerd[2775]: time="2025-05-27T18:10:31.847396778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:10:31.847545 kubelet[4358]: E0527 18:10:31.847501 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:10:31.847801 kubelet[4358]: E0527 18:10:31.847557 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:10:31.847801 kubelet[4358]: E0527 18:10:31.847698 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:10:31.848873 kubelet[4358]: E0527 18:10:31.848846 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:10:32.158277 containerd[2775]: 
time="2025-05-27T18:10:32.158255405Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"cd39d7aed70c4cf7a4ddc1bc2d0900d480fc94e083f91a389ef2c82d783c5144\" pid:8881 exited_at:{seconds:1748369432 nanos:158081445}" May 27 18:10:33.747222 containerd[2775]: time="2025-05-27T18:10:33.747178428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:10:33.775145 containerd[2775]: time="2025-05-27T18:10:33.775090910Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:10:33.775459 containerd[2775]: time="2025-05-27T18:10:33.775412312Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:10:33.775512 containerd[2775]: time="2025-05-27T18:10:33.775460992Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:10:33.775729 kubelet[4358]: E0527 18:10:33.775656 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 
Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:10:33.776030 kubelet[4358]: E0527 18:10:33.775745 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:10:33.776030 kubelet[4358]: E0527 18:10:33.775921 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]En
vFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:10:33.777667 containerd[2775]: time="2025-05-27T18:10:33.777648081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:10:33.801641 containerd[2775]: time="2025-05-27T18:10:33.801603187Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:10:33.806470 containerd[2775]: time="2025-05-27T18:10:33.806431808Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:10:33.806538 containerd[2775]: time="2025-05-27T18:10:33.806450368Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:10:33.806659 kubelet[4358]: E0527 18:10:33.806619 4358 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:10:33.806716 kubelet[4358]: E0527 18:10:33.806665 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:10:33.806802 kubelet[4358]: E0527 18:10:33.806766 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:10:33.807935 kubelet[4358]: E0527 18:10:33.807902 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:10:34.306371 containerd[2775]: time="2025-05-27T18:10:34.306348913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"a23c4120e6a60c411cd88251e0619a057470db179ce3931e1cac06b3db734230\" pid:8903 exited_at:{seconds:1748369434 nanos:306212712}" May 27 18:10:42.739776 containerd[2775]: time="2025-05-27T18:10:42.739699128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" 
id:\"21af1b6ab9d42489b6cacb062bc73fe2dcf5f411ab7b9f84d07beca05674a2f5\" pid:8935 exited_at:{seconds:1748369442 nanos:739446447}" May 27 18:10:44.745011 kubelet[4358]: E0527 18:10:44.744967 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:10:46.745087 kubelet[4358]: E0527 18:10:46.745047 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:10:57.745403 kubelet[4358]: E0527 18:10:57.745352 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:10:59.745062 kubelet[4358]: E0527 18:10:59.745007 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:11:04.303790 containerd[2775]: time="2025-05-27T18:11:04.303748405Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"7217a1b6ff06e76b04a31e1cf6bd2c173fbdeed6256c741e4c627ec7362d0f54\" pid:9003 exited_at:{seconds:1748369464 nanos:303551685}" May 27 18:11:09.746120 kubelet[4358]: E0527 18:11:09.746070 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:11:12.741135 containerd[2775]: time="2025-05-27T18:11:12.741086933Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"60877e996f20f12efce81dcf10ce8196d4bfc4fceed08d11b1a56563c4099830\" pid:9035 exited_at:{seconds:1748369472 nanos:740830493}" May 27 18:11:12.745329 kubelet[4358]: E0527 18:11:12.745275 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:11:23.746136 kubelet[4358]: E0527 18:11:23.746075 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:11:26.744949 kubelet[4358]: E0527 
18:11:26.744896 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:11:32.157540 containerd[2775]: time="2025-05-27T18:11:32.157502409Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"8597a2ef6b16f1aab83a2a6b282a51b9d4686b23f9522287dc70c31ac873af11\" pid:9089 exited_at:{seconds:1748369492 nanos:157347449}" May 27 18:11:34.306447 containerd[2775]: time="2025-05-27T18:11:34.306401494Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"38440b62ccf1ea4ef2205e14330b14f325bd031840bd89939f221ed34778c8e0\" pid:9111 exited_at:{seconds:1748369494 nanos:306245653}" May 27 18:11:38.745165 kubelet[4358]: E0527 18:11:38.745122 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:11:38.745643 kubelet[4358]: E0527 18:11:38.745295 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:11:42.740176 containerd[2775]: time="2025-05-27T18:11:42.740135366Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"1f70aae2654bdc8e5d4948ca5a3978dbd6e987cea29b87de8b402d88261bf257\" pid:9140 exited_at:{seconds:1748369502 nanos:739886285}" May 27 18:11:52.745720 containerd[2775]: time="2025-05-27T18:11:52.745676326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:11:52.746182 kubelet[4358]: E0527 18:11:52.745740 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:11:52.910843 containerd[2775]: time="2025-05-27T18:11:52.910803646Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:11:52.911062 containerd[2775]: time="2025-05-27T18:11:52.911032927Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:11:52.911121 containerd[2775]: time="2025-05-27T18:11:52.911085927Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:11:52.911213 kubelet[4358]: E0527 
18:11:52.911180 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:11:52.911255 kubelet[4358]: E0527 18:11:52.911223 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:11:52.911365 kubelet[4358]: E0527 18:11:52.911325 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:11:52.912502 kubelet[4358]: E0527 18:11:52.912474 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:03.745066 kubelet[4358]: E0527 
18:12:03.745021 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:04.307535 containerd[2775]: time="2025-05-27T18:12:04.307499862Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"ed6b4cdd72cc17943c00325534121c612961f9b9e6e36506e7196844478ede71\" pid:9187 exited_at:{seconds:1748369524 nanos:307325182}" May 27 18:12:07.745306 containerd[2775]: time="2025-05-27T18:12:07.745229791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:12:07.779011 containerd[2775]: time="2025-05-27T18:12:07.778985135Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:12:07.779216 containerd[2775]: time="2025-05-27T18:12:07.779189936Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:12:07.779257 containerd[2775]: time="2025-05-27T18:12:07.779239616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:12:07.779372 kubelet[4358]: E0527 18:12:07.779336 4358 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:12:07.779615 kubelet[4358]: E0527 18:12:07.779383 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:12:07.779615 kubelet[4358]: E0527 18:12:07.779467 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:12:07.781084 containerd[2775]: 
time="2025-05-27T18:12:07.781066862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:12:07.837667 containerd[2775]: time="2025-05-27T18:12:07.837630518Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:12:07.837891 containerd[2775]: time="2025-05-27T18:12:07.837865318Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:12:07.837953 containerd[2775]: time="2025-05-27T18:12:07.837880358Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:12:07.838028 kubelet[4358]: E0527 18:12:07.837996 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:12:07.838080 kubelet[4358]: E0527 18:12:07.838031 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:12:07.838148 kubelet[4358]: E0527 18:12:07.838112 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:12:07.839294 kubelet[4358]: E0527 18:12:07.839264 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:12:12.740882 containerd[2775]: 
time="2025-05-27T18:12:12.740850357Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"6b73ca586b58ce4de3c67b83b7b78c629ca89a76645d28944644e41ce7c85095\" pid:9220 exited_at:{seconds:1748369532 nanos:740612757}" May 27 18:12:14.745105 kubelet[4358]: E0527 18:12:14.745066 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:20.744983 kubelet[4358]: E0527 18:12:20.744938 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:12:25.745694 kubelet[4358]: E0527 18:12:25.745644 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:32.157559 containerd[2775]: time="2025-05-27T18:12:32.157507850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"69567813c086d206e75d92f638f7f63ff9ebba0aeaf6458d301efdb9d290500b\" pid:9289 exited_at:{seconds:1748369552 nanos:157313369}" May 27 18:12:34.303524 containerd[2775]: time="2025-05-27T18:12:34.303490392Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"fe6ff04c9c8177333eaf0afd4a1492e81397fef777e43ec94b88e082e11e98fc\" pid:9311 exited_at:{seconds:1748369554 nanos:303348712}" May 27 18:12:34.745416 kubelet[4358]: E0527 18:12:34.745363 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:12:38.744833 kubelet[4358]: E0527 18:12:38.744772 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:42.742012 containerd[2775]: time="2025-05-27T18:12:42.741964678Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"1bb7f869469f7e01cbabdc8bfeef9ae7fd6d1fe770fca50593009b258bd89036\" pid:9336 exited_at:{seconds:1748369562 nanos:741703597}" May 27 18:12:45.744901 kubelet[4358]: E0527 18:12:45.744866 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:12:49.744910 kubelet[4358]: E0527 
18:12:49.744866 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:12:56.744724 kubelet[4358]: E0527 18:12:56.744668 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:13:00.745561 kubelet[4358]: E0527 18:13:00.745488 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:13:04.312666 containerd[2775]: time="2025-05-27T18:13:04.312626099Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"27dc0f850fca233b76c0504ff5db09bc74ec33b18bb95c3dfc3f74c8c5d51ca0\" pid:9376 exited_at:{seconds:1748369584 nanos:312428898}" May 27 18:13:08.745060 kubelet[4358]: E0527 18:13:08.745003 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" 
podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:13:12.730045 containerd[2775]: time="2025-05-27T18:13:12.729997106Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"34f2a3086c655438b6081d97311921c5a832fb08c9bd0335e983a28b17a74507\" pid:9407 exited_at:{seconds:1748369592 nanos:729769266}" May 27 18:13:12.744978 kubelet[4358]: E0527 18:13:12.744916 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:13:21.745615 kubelet[4358]: E0527 18:13:21.745564 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:13:24.745365 kubelet[4358]: E0527 18:13:24.745311 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:13:25.165466 containerd[2775]: time="2025-05-27T18:13:25.165398322Z" level=warning msg="container event discarded" container=4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a type=CONTAINER_CREATED_EVENT May 27 18:13:25.176665 containerd[2775]: time="2025-05-27T18:13:25.176600916Z" level=warning msg="container event discarded" 
container=4f5e3a768ef0028ac87c30eaefb7fb109432fa12ce2058146a7f76cc2910019a type=CONTAINER_STARTED_EVENT May 27 18:13:25.176665 containerd[2775]: time="2025-05-27T18:13:25.176637436Z" level=warning msg="container event discarded" container=30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d type=CONTAINER_CREATED_EVENT May 27 18:13:25.176665 containerd[2775]: time="2025-05-27T18:13:25.176656716Z" level=warning msg="container event discarded" container=30d7a6873f819ac350a1288324f8e5b9c588a272b64775683eed5547a1cabc1d type=CONTAINER_STARTED_EVENT May 27 18:13:25.176665 containerd[2775]: time="2025-05-27T18:13:25.176663996Z" level=warning msg="container event discarded" container=99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3 type=CONTAINER_CREATED_EVENT May 27 18:13:25.176665 containerd[2775]: time="2025-05-27T18:13:25.176670916Z" level=warning msg="container event discarded" container=99511bca5399f5b4d1fe7794402dc419144f242b61708e13be059a454bddfac3 type=CONTAINER_STARTED_EVENT May 27 18:13:25.187868 containerd[2775]: time="2025-05-27T18:13:25.187839910Z" level=warning msg="container event discarded" container=f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23 type=CONTAINER_CREATED_EVENT May 27 18:13:25.187868 containerd[2775]: time="2025-05-27T18:13:25.187867230Z" level=warning msg="container event discarded" container=aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42 type=CONTAINER_CREATED_EVENT May 27 18:13:25.187935 containerd[2775]: time="2025-05-27T18:13:25.187876550Z" level=warning msg="container event discarded" container=02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2 type=CONTAINER_CREATED_EVENT May 27 18:13:25.248088 containerd[2775]: time="2025-05-27T18:13:25.248064293Z" level=warning msg="container event discarded" container=02a69392b1399272949c6493a232c64c1c5d60bcf2d1f57d7d47a3209ea4d9d2 type=CONTAINER_STARTED_EVENT May 27 18:13:25.248088 containerd[2775]: 
time="2025-05-27T18:13:25.248086213Z" level=warning msg="container event discarded" container=f69bd08b7c1f0074766108d4e25acc034bc6e5f588242e68a4be06e4d2d3dd23 type=CONTAINER_STARTED_EVENT May 27 18:13:25.248156 containerd[2775]: time="2025-05-27T18:13:25.248093333Z" level=warning msg="container event discarded" container=aa6ef2778ad67997fb7072241cdb99f4d2da271d7f9f9165fbc6f2f3ad7cdc42 type=CONTAINER_STARTED_EVENT May 27 18:13:32.155363 containerd[2775]: time="2025-05-27T18:13:32.155328387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"af037fd47022bb9f1325c2367933a5bae7b4c86720066c2b7bcb1e6cfe27151b\" pid:9456 exited_at:{seconds:1748369612 nanos:155152187}" May 27 18:13:33.745917 kubelet[4358]: E0527 18:13:33.745885 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:13:34.295568 containerd[2775]: time="2025-05-27T18:13:34.295525737Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"7ea4137c19eeea78ac93af88fa6ddc5b690375f4fae99e6845ffa856fefc2c79\" pid:9478 exited_at:{seconds:1748369614 nanos:295343496}" May 27 18:13:36.179882 containerd[2775]: time="2025-05-27T18:13:36.179834547Z" level=warning msg="container event discarded" container=fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e type=CONTAINER_CREATED_EVENT May 27 18:13:36.179882 containerd[2775]: time="2025-05-27T18:13:36.179865907Z" level=warning msg="container event discarded" 
container=fe9a751502649b3264ba4cdd502f3c7e702989f8bdc3e742a1a1b1356035170e type=CONTAINER_STARTED_EVENT May 27 18:13:36.192034 containerd[2775]: time="2025-05-27T18:13:36.191984264Z" level=warning msg="container event discarded" container=d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7 type=CONTAINER_CREATED_EVENT May 27 18:13:36.259226 containerd[2775]: time="2025-05-27T18:13:36.259201949Z" level=warning msg="container event discarded" container=d17a642be2f84c831410cc497e00fc228706e074bf59119d2631f4895bfa55e7 type=CONTAINER_STARTED_EVENT May 27 18:13:36.531578 containerd[2775]: time="2025-05-27T18:13:36.531481177Z" level=warning msg="container event discarded" container=bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928 type=CONTAINER_CREATED_EVENT May 27 18:13:36.531578 containerd[2775]: time="2025-05-27T18:13:36.531501777Z" level=warning msg="container event discarded" container=bf3789de9882a71390460751ebcd67cebad4452a5aeb00e386c0e6d4db966928 type=CONTAINER_STARTED_EVENT May 27 18:13:39.745958 kubelet[4358]: E0527 18:13:39.745896 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:13:39.958164 containerd[2775]: time="2025-05-27T18:13:39.958080557Z" level=warning msg="container event discarded" container=f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579 type=CONTAINER_CREATED_EVENT May 27 18:13:40.009363 containerd[2775]: time="2025-05-27T18:13:40.009272272Z" level=warning msg="container event discarded" container=f49ee27b14cc37608b4c0a0d44bdf67dc788c009bac64f44e0db19e16d2da579 type=CONTAINER_STARTED_EVENT May 27 18:13:42.748941 containerd[2775]: time="2025-05-27T18:13:42.748889082Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"131ab373ca8d9cf47a9fe694877730ca8d9dd5f07765fc95b03fda46fb0bdf7f\" pid:9502 exited_at:{seconds:1748369622 nanos:748613001}" May 27 18:13:45.745329 kubelet[4358]: E0527 18:13:45.745280 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:13:49.922675 containerd[2775]: time="2025-05-27T18:13:49.922597930Z" level=warning msg="container event discarded" container=76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f type=CONTAINER_CREATED_EVENT May 27 18:13:49.922675 containerd[2775]: time="2025-05-27T18:13:49.922647930Z" level=warning msg="container event discarded" container=76e0816753c2b0168fa2936de4848e2daa57435ccdb674d7f928ebc9fcb5882f type=CONTAINER_STARTED_EVENT May 27 18:13:50.099316 containerd[2775]: time="2025-05-27T18:13:50.099285347Z" level=warning msg="container event discarded" container=42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92 type=CONTAINER_CREATED_EVENT May 27 18:13:50.099316 containerd[2775]: time="2025-05-27T18:13:50.099316067Z" level=warning msg="container event discarded" container=42c14c4d6a92894be455f384f61d3fd1754b11736abf4cd13d5231a3980adc92 type=CONTAINER_STARTED_EVENT May 27 18:13:50.616613 containerd[2775]: time="2025-05-27T18:13:50.616555480Z" level=warning msg="container event discarded" container=96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540 type=CONTAINER_CREATED_EVENT May 27 18:13:50.670257 containerd[2775]: time="2025-05-27T18:13:50.670229843Z" level=warning msg="container event discarded" 
container=96f14628e731a623bb3b767042ed6a72f3be16884841095e9d8d12fb45ffb540 type=CONTAINER_STARTED_EVENT May 27 18:13:51.024061 containerd[2775]: time="2025-05-27T18:13:51.023989678Z" level=warning msg="container event discarded" container=a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f type=CONTAINER_CREATED_EVENT May 27 18:13:51.084659 containerd[2775]: time="2025-05-27T18:13:51.084626462Z" level=warning msg="container event discarded" container=a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f type=CONTAINER_STARTED_EVENT May 27 18:13:51.376829 containerd[2775]: time="2025-05-27T18:13:51.376799510Z" level=warning msg="container event discarded" container=a4f0c0c844d33772c9734db47335423685447764a46ef3cf0252ad59a772b53f type=CONTAINER_STOPPED_EVENT May 27 18:13:53.622321 containerd[2775]: time="2025-05-27T18:13:53.622267576Z" level=warning msg="container event discarded" container=4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d type=CONTAINER_CREATED_EVENT May 27 18:13:53.674449 containerd[2775]: time="2025-05-27T18:13:53.674415654Z" level=warning msg="container event discarded" container=4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d type=CONTAINER_STARTED_EVENT May 27 18:13:53.746765 kubelet[4358]: E0527 18:13:53.746688 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:13:54.186567 containerd[2775]: time="2025-05-27T18:13:54.186496930Z" level=warning msg="container event discarded" container=4edd922f30729b40b50831bdbdb209c85300106104431eb15937425acbad457d type=CONTAINER_STOPPED_EVENT May 27 18:13:57.446524 containerd[2775]: time="2025-05-27T18:13:57.446476758Z" level=warning msg="container event discarded" 
container=79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8 type=CONTAINER_CREATED_EVENT May 27 18:13:57.505710 containerd[2775]: time="2025-05-27T18:13:57.505672898Z" level=warning msg="container event discarded" container=79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8 type=CONTAINER_STARTED_EVENT May 27 18:13:58.320829 containerd[2775]: time="2025-05-27T18:13:58.320791535Z" level=warning msg="container event discarded" container=2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3 type=CONTAINER_CREATED_EVENT May 27 18:13:58.320829 containerd[2775]: time="2025-05-27T18:13:58.320825415Z" level=warning msg="container event discarded" container=2376f836920e12a8bd099f8deb3b6ba4973c05a6297395ca45d80670dcbcd2f3 type=CONTAINER_STARTED_EVENT May 27 18:13:59.745961 kubelet[4358]: E0527 18:13:59.745923 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:14:04.300763 containerd[2775]: time="2025-05-27T18:14:04.300721188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"db0d76dec7cf213e65635d7b43eb560726617597302d59da8d79319b7e1248d0\" pid:9567 exited_at:{seconds:1748369644 nanos:300507587}" May 27 18:14:05.745668 kubelet[4358]: E0527 18:14:05.745624 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" 
podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:14:05.941013 containerd[2775]: time="2025-05-27T18:14:05.940956012Z" level=warning msg="container event discarded" container=36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5 type=CONTAINER_CREATED_EVENT May 27 18:14:05.941013 containerd[2775]: time="2025-05-27T18:14:05.941004052Z" level=warning msg="container event discarded" container=36ba08d2d323fbc1a9a1708c846a5777169ba091d1840dbdbd83cf0f32a26dd5 type=CONTAINER_STARTED_EVENT May 27 18:14:06.013156 containerd[2775]: time="2025-05-27T18:14:06.013017151Z" level=warning msg="container event discarded" container=4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a type=CONTAINER_CREATED_EVENT May 27 18:14:06.013156 containerd[2775]: time="2025-05-27T18:14:06.013063071Z" level=warning msg="container event discarded" container=4ddd9647f6fe98c978047c893f460380db115bd482a7f1c7110038426741fd2a type=CONTAINER_STARTED_EVENT May 27 18:14:06.839793 containerd[2775]: time="2025-05-27T18:14:06.839732303Z" level=warning msg="container event discarded" container=cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2 type=CONTAINER_CREATED_EVENT May 27 18:14:06.898976 containerd[2775]: time="2025-05-27T18:14:06.898944523Z" level=warning msg="container event discarded" container=cfa43317e823b1edecc9286c2fd4dad64ad5d3fb0cf1d89f314b3160fd095bd2 type=CONTAINER_STARTED_EVENT May 27 18:14:06.899110 containerd[2775]: time="2025-05-27T18:14:06.899070763Z" level=warning msg="container event discarded" container=9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4 type=CONTAINER_CREATED_EVENT May 27 18:14:06.899110 containerd[2775]: time="2025-05-27T18:14:06.899088683Z" level=warning msg="container event discarded" container=9e1f37d5ced39778d70870b1a849726e54e1f7b72233342895acee9a151c51f4 type=CONTAINER_STARTED_EVENT May 27 18:14:06.912293 containerd[2775]: time="2025-05-27T18:14:06.912250803Z" level=warning msg="container event 
discarded" container=560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa type=CONTAINER_CREATED_EVENT May 27 18:14:06.963477 containerd[2775]: time="2025-05-27T18:14:06.963439399Z" level=warning msg="container event discarded" container=560cfc8153c93ea97b8f028d1df31ecc806ff8f2327ba52f5147857caed78aaa type=CONTAINER_STARTED_EVENT May 27 18:14:09.905716 containerd[2775]: time="2025-05-27T18:14:09.905663699Z" level=warning msg="container event discarded" container=fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d type=CONTAINER_CREATED_EVENT May 27 18:14:09.905716 containerd[2775]: time="2025-05-27T18:14:09.905694939Z" level=warning msg="container event discarded" container=fb746ea8d4b9fef494c046f780cd7bd6b78da6ebc91efc19a876408bd3d6924d type=CONTAINER_STARTED_EVENT May 27 18:14:10.006040 containerd[2775]: time="2025-05-27T18:14:10.005990443Z" level=warning msg="container event discarded" container=9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a type=CONTAINER_CREATED_EVENT May 27 18:14:10.006040 containerd[2775]: time="2025-05-27T18:14:10.006020124Z" level=warning msg="container event discarded" container=9a7bde6eeff17f54bd033e7b90a3de1a0278531105404f78877741f86291843a type=CONTAINER_STARTED_EVENT May 27 18:14:10.115121 containerd[2775]: time="2025-05-27T18:14:10.115073495Z" level=warning msg="container event discarded" container=72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad type=CONTAINER_CREATED_EVENT May 27 18:14:10.115121 containerd[2775]: time="2025-05-27T18:14:10.115102015Z" level=warning msg="container event discarded" container=72dc415f86fc38e401fdc7384e8d513680375c11d8d0e2729ad30e7ea6c6a2ad type=CONTAINER_STARTED_EVENT May 27 18:14:10.115121 containerd[2775]: time="2025-05-27T18:14:10.115109615Z" level=warning msg="container event discarded" container=bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8 type=CONTAINER_CREATED_EVENT May 27 18:14:10.186432 containerd[2775]: 
time="2025-05-27T18:14:10.186361791Z" level=warning msg="container event discarded" container=bfc7559be8ec0cd2c9402e3f5a82f8ed000e0e506cbf7553b497bfa84f36bba8 type=CONTAINER_STARTED_EVENT May 27 18:14:10.186432 containerd[2775]: time="2025-05-27T18:14:10.186384312Z" level=warning msg="container event discarded" container=44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484 type=CONTAINER_CREATED_EVENT May 27 18:14:10.186432 containerd[2775]: time="2025-05-27T18:14:10.186391472Z" level=warning msg="container event discarded" container=44ecc7b57683179452b58274b9efe427531493b35ab7d9594a05bc9952ab2484 type=CONTAINER_STARTED_EVENT May 27 18:14:10.202522 containerd[2775]: time="2025-05-27T18:14:10.202493721Z" level=warning msg="container event discarded" container=b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f type=CONTAINER_CREATED_EVENT May 27 18:14:10.260702 containerd[2775]: time="2025-05-27T18:14:10.260685097Z" level=warning msg="container event discarded" container=b04e83363199e8f3510e7932bdc7dc18bbfb49397902727ae7eed636b217336f type=CONTAINER_STARTED_EVENT May 27 18:14:10.751489 containerd[2775]: time="2025-05-27T18:14:10.751450028Z" level=warning msg="container event discarded" container=7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016 type=CONTAINER_CREATED_EVENT May 27 18:14:10.818696 containerd[2775]: time="2025-05-27T18:14:10.818662113Z" level=warning msg="container event discarded" container=7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016 type=CONTAINER_STARTED_EVENT May 27 18:14:11.142606 containerd[2775]: time="2025-05-27T18:14:11.142509377Z" level=warning msg="container event discarded" container=015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236 type=CONTAINER_CREATED_EVENT May 27 18:14:11.206821 containerd[2775]: time="2025-05-27T18:14:11.206778092Z" level=warning msg="container event discarded" container=015aa5a9cfd9ed3549dfd54d8528ccfe40cd5c1015650a5e0ee17ed39440a236 
type=CONTAINER_STARTED_EVENT May 27 18:14:11.706631 containerd[2775]: time="2025-05-27T18:14:11.706595331Z" level=warning msg="container event discarded" container=ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3 type=CONTAINER_CREATED_EVENT May 27 18:14:11.762938 containerd[2775]: time="2025-05-27T18:14:11.762909702Z" level=warning msg="container event discarded" container=ad31ad7952230827c0ded83778a35932ee4b2111c6f67dd4b009ff611ca43be3 type=CONTAINER_STARTED_EVENT May 27 18:14:12.734949 containerd[2775]: time="2025-05-27T18:14:12.734913335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"8c32bed6f74ac8c9bccfc48e86c29fcd387bb889053d2465ecab4cdbc9b10783\" pid:9596 exited_at:{seconds:1748369652 nanos:734688854}" May 27 18:14:13.745621 kubelet[4358]: E0527 18:14:13.745561 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:14:18.744882 kubelet[4358]: E0527 18:14:18.744817 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:14:20.357736 update_engine[2765]: I20250527 18:14:20.357643 2765 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 27 18:14:20.357736 update_engine[2765]: I20250527 18:14:20.357702 2765 prefs.cc:52] certificate-report-to-send-download not 
present in /var/lib/update_engine/prefs May 27 18:14:20.358118 update_engine[2765]: I20250527 18:14:20.357918 2765 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 27 18:14:20.358269 update_engine[2765]: I20250527 18:14:20.358246 2765 omaha_request_params.cc:62] Current group set to alpha May 27 18:14:20.358440 update_engine[2765]: I20250527 18:14:20.358422 2765 update_attempter.cc:499] Already updated boot flags. Skipping. May 27 18:14:20.358497 update_engine[2765]: I20250527 18:14:20.358484 2765 update_attempter.cc:643] Scheduling an action processor start. May 27 18:14:20.358793 update_engine[2765]: I20250527 18:14:20.358541 2765 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 18:14:20.358793 update_engine[2765]: I20250527 18:14:20.358575 2765 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 27 18:14:20.358793 update_engine[2765]: I20250527 18:14:20.358641 2765 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 18:14:20.358793 update_engine[2765]: I20250527 18:14:20.358649 2765 omaha_request_action.cc:272] Request: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: May 27 18:14:20.358793 update_engine[2765]: I20250527 18:14:20.358654 2765 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:14:20.359083 locksmithd[2828]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 27 18:14:20.359708 update_engine[2765]: I20250527 18:14:20.359683 2765 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:14:20.360109 update_engine[2765]: I20250527 18:14:20.360082 2765 
libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 18:14:20.360759 update_engine[2765]: E20250527 18:14:20.360735 2765 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:14:20.360866 update_engine[2765]: I20250527 18:14:20.360850 2765 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 27 18:14:24.745864 kubelet[4358]: E0527 18:14:24.745813 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:14:30.307865 update_engine[2765]: I20250527 18:14:30.307691 2765 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:14:30.308516 update_engine[2765]: I20250527 18:14:30.308097 2765 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:14:30.308516 update_engine[2765]: I20250527 18:14:30.308473 2765 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 18:14:30.309096 update_engine[2765]: E20250527 18:14:30.309067 2765 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:14:30.309163 update_engine[2765]: I20250527 18:14:30.309144 2765 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 27 18:14:32.153555 containerd[2775]: time="2025-05-27T18:14:32.153512848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"3fe5bf82e57726ba59e58ee0e4cb225ce18c9022c1e6b0aca7d0c4ad9f82097b\" pid:9637 exited_at:{seconds:1748369672 nanos:153289488}" May 27 18:14:33.746061 containerd[2775]: time="2025-05-27T18:14:33.746016446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:14:33.774593 containerd[2775]: time="2025-05-27T18:14:33.774546492Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:14:33.774824 containerd[2775]: time="2025-05-27T18:14:33.774792293Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:14:33.774885 containerd[2775]: time="2025-05-27T18:14:33.774841733Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:14:33.774978 kubelet[4358]: E0527 18:14:33.774930 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:14:33.775217 kubelet[4358]: E0527 18:14:33.774989 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:14:33.775217 kubelet[4358]: E0527 18:14:33.775113 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4wx4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-qrwrj_calico-system(e42660bd-427d-4774-a8fa-836372ea3d07): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:14:33.776283 kubelet[4358]: E0527 18:14:33.776256 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:14:34.311914 containerd[2775]: 
time="2025-05-27T18:14:34.311887325Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"33979f715e6ae2f079892719eda3a4c8df293ae52192fc7e6e8532b440d5e7c4\" pid:9659 exited_at:{seconds:1748369674 nanos:311732884}" May 27 18:14:37.746008 kubelet[4358]: E0527 18:14:37.745947 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:14:40.307677 update_engine[2765]: I20250527 18:14:40.307605 2765 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:14:40.308457 update_engine[2765]: I20250527 18:14:40.308207 2765 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:14:40.308457 update_engine[2765]: I20250527 18:14:40.308421 2765 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 18:14:40.308864 update_engine[2765]: E20250527 18:14:40.308801 2765 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:14:40.308864 update_engine[2765]: I20250527 18:14:40.308844 2765 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 27 18:14:42.740223 containerd[2775]: time="2025-05-27T18:14:42.740175606Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"8e3169c6440e8485b0056a2f1a3ff57b616b9ac761995da57e6be9dd1b558516\" pid:9694 exited_at:{seconds:1748369682 nanos:739856206}" May 27 18:14:45.745258 kubelet[4358]: E0527 18:14:45.745206 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:14:50.307671 update_engine[2765]: I20250527 18:14:50.307610 2765 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:14:50.308027 update_engine[2765]: I20250527 18:14:50.307847 2765 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:14:50.308080 update_engine[2765]: I20250527 18:14:50.308049 2765 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 18:14:50.308475 update_engine[2765]: E20250527 18:14:50.308456 2765 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:14:50.308515 update_engine[2765]: I20250527 18:14:50.308491 2765 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 18:14:50.308515 update_engine[2765]: I20250527 18:14:50.308498 2765 omaha_request_action.cc:617] Omaha request response: May 27 18:14:50.308604 update_engine[2765]: E20250527 18:14:50.308573 2765 omaha_request_action.cc:636] Omaha request network transfer failed. May 27 18:14:50.308632 update_engine[2765]: I20250527 18:14:50.308610 2765 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 27 18:14:50.308632 update_engine[2765]: I20250527 18:14:50.308615 2765 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:14:50.308632 update_engine[2765]: I20250527 18:14:50.308620 2765 update_attempter.cc:306] Processing Done. May 27 18:14:50.308693 update_engine[2765]: E20250527 18:14:50.308632 2765 update_attempter.cc:619] Update failed. May 27 18:14:50.308693 update_engine[2765]: I20250527 18:14:50.308638 2765 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 27 18:14:50.308693 update_engine[2765]: I20250527 18:14:50.308642 2765 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 27 18:14:50.308693 update_engine[2765]: I20250527 18:14:50.308647 2765 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
May 27 18:14:50.308765 update_engine[2765]: I20250527 18:14:50.308707 2765 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 18:14:50.308765 update_engine[2765]: I20250527 18:14:50.308727 2765 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 18:14:50.308765 update_engine[2765]: I20250527 18:14:50.308732 2765 omaha_request_action.cc:272] Request: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: May 27 18:14:50.308765 update_engine[2765]: I20250527 18:14:50.308737 2765 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:14:50.308929 update_engine[2765]: I20250527 18:14:50.308848 2765 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:14:50.308994 locksmithd[2828]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 27 18:14:50.309214 update_engine[2765]: I20250527 18:14:50.309010 2765 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 18:14:50.309479 update_engine[2765]: E20250527 18:14:50.309463 2765 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:14:50.309503 update_engine[2765]: I20250527 18:14:50.309492 2765 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 18:14:50.309503 update_engine[2765]: I20250527 18:14:50.309499 2765 omaha_request_action.cc:617] Omaha request response: May 27 18:14:50.309539 update_engine[2765]: I20250527 18:14:50.309504 2765 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:14:50.309539 update_engine[2765]: I20250527 18:14:50.309507 2765 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:14:50.309539 update_engine[2765]: I20250527 18:14:50.309511 2765 update_attempter.cc:306] Processing Done. May 27 18:14:50.309539 update_engine[2765]: I20250527 18:14:50.309516 2765 update_attempter.cc:310] Error event sent. 
May 27 18:14:50.309539 update_engine[2765]: I20250527 18:14:50.309522 2765 update_check_scheduler.cc:74] Next update check in 44m25s May 27 18:14:50.309708 locksmithd[2828]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 27 18:14:50.744879 containerd[2775]: time="2025-05-27T18:14:50.744850160Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:14:50.774759 containerd[2775]: time="2025-05-27T18:14:50.774712411Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:14:50.775006 containerd[2775]: time="2025-05-27T18:14:50.774980092Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:14:50.775045 containerd[2775]: time="2025-05-27T18:14:50.775024532Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:14:50.775135 kubelet[4358]: E0527 18:14:50.775107 4358 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" 
image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:14:50.775368 kubelet[4358]: E0527 18:14:50.775145 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:14:50.775368 kubelet[4358]: E0527 18:14:50.775225 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:7372ac9998994bc2afef8b2a28678866,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource
{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:14:50.776849 containerd[2775]: time="2025-05-27T18:14:50.776833658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:14:50.856426 containerd[2775]: time="2025-05-27T18:14:50.856386059Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:14:50.856672 containerd[2775]: time="2025-05-27T18:14:50.856641820Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:14:50.856717 containerd[2775]: time="2025-05-27T18:14:50.856670180Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:14:50.856835 kubelet[4358]: E0527 18:14:50.856794 4358 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:14:50.856877 kubelet[4358]: E0527 18:14:50.856841 4358 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:14:50.856958 kubelet[4358]: E0527 18:14:50.856925 4358 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zwk6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5d46c699d6-jdjnd_calico-system(7b02d917-929c-457a-9f01-27a3d9745228): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:14:50.858115 kubelet[4358]: E0527 18:14:50.858077 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:14:56.745452 kubelet[4358]: E0527 18:14:56.745414 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:15:03.745416 kubelet[4358]: E0527 18:15:03.745371 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:15:04.305470 containerd[2775]: time="2025-05-27T18:15:04.305417469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"d91ddb2f0e458c33a50db1ef6ebbec7d9ebf3c6fb3ddd26aa578b57015c99594\" pid:9733 exited_at:{seconds:1748369704 nanos:305245388}" May 27 18:15:07.745028 kubelet[4358]: E0527 18:15:07.744972 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:15:12.740388 containerd[2775]: time="2025-05-27T18:15:12.740351768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"6e7fcd67f95ec1022ff6214dbd3dd029b89dc0c8f4d27db6ad3cf83288396019\" pid:9760 exited_at:{seconds:1748369712 nanos:740090687}" May 27 18:15:17.745929 kubelet[4358]: E0527 18:15:17.745877 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:15:22.744838 kubelet[4358]: E0527 18:15:22.744791 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:15:31.746002 kubelet[4358]: E0527 18:15:31.745949 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:15:32.160314 containerd[2775]: time="2025-05-27T18:15:32.160268189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"fc38b421839806e93f7242f786bf142686d500c0788d169174c58351ed145706\" pid:9833 exited_at:{seconds:1748369732 nanos:159484707}" May 27 18:15:34.305662 containerd[2775]: time="2025-05-27T18:15:34.305619545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"b0a6dcf35cba389dbcff46a3dc75d8146d24b89c1620a4b0f2684d93edac3cd5\" pid:9855 exited_at:{seconds:1748369734 nanos:305440825}" May 27 18:15:37.745409 kubelet[4358]: E0527 18:15:37.745353 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:15:42.734033 containerd[2775]: time="2025-05-27T18:15:42.733990543Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" 
id:\"99346894652e9231018d788dd9b6c3e2934b299ce7bf7dec77b68cc6f48a62ad\" pid:9879 exited_at:{seconds:1748369742 nanos:733760302}" May 27 18:15:44.745886 kubelet[4358]: E0527 18:15:44.745826 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:15:52.744819 kubelet[4358]: E0527 18:15:52.744771 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:15:55.745118 kubelet[4358]: E0527 18:15:55.745082 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:16:04.303838 containerd[2775]: time="2025-05-27T18:16:04.303792292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"c7dcfc7e8baf09144ce15fe47f5a3b3a121907d44654a47f5c9db0c2c1ebb31c\" pid:9913 exited_at:{seconds:1748369764 nanos:303420451}" May 27 18:16:04.745406 kubelet[4358]: E0527 18:16:04.745378 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:16:08.745253 kubelet[4358]: E0527 18:16:08.745216 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:16:12.742616 containerd[2775]: time="2025-05-27T18:16:12.742559321Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"39ca2c77c51c0d0e48bc680067428995b7cd791347540d38854d695c946c285d\" pid:9939 exited_at:{seconds:1748369772 nanos:742293041}" May 27 18:16:19.745614 kubelet[4358]: E0527 18:16:19.745555 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:16:19.746077 kubelet[4358]: E0527 18:16:19.745823 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:16:31.744916 kubelet[4358]: 
E0527 18:16:31.744820 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:16:32.146673 containerd[2775]: time="2025-05-27T18:16:32.146637893Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"aaf35bc49c29c0628e39e8bd2e254cf5d3e18e0a0c64b7918b44ff67ccca284a\" pid:9979 exited_at:{seconds:1748369792 nanos:146461692}" May 27 18:16:32.745049 kubelet[4358]: E0527 18:16:32.744991 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:16:34.304842 containerd[2775]: time="2025-05-27T18:16:34.304799407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"2ee22f15d6a61b70dd8e9d079b2a528659e0e54195fc96652ab9e463329ed456\" pid:10001 exited_at:{seconds:1748369794 nanos:304603726}" May 27 18:16:42.739828 containerd[2775]: time="2025-05-27T18:16:42.739753304Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"de4121d8f4ad5aef812dbff56c9152cf0a6cfcbfc5d908a2c382f1edefb68bc3\" pid:10024 exited_at:{seconds:1748369802 nanos:739534344}" May 27 18:16:43.744826 kubelet[4358]: E0527 18:16:43.744785 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:16:46.745458 kubelet[4358]: E0527 18:16:46.745411 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:16:55.005443 systemd[1]: Started sshd@9-147.28.228.207:22-139.178.89.65:37892.service - OpenSSH per-connection server daemon (139.178.89.65:37892). May 27 18:16:55.422337 sshd[10054]: Accepted publickey for core from 139.178.89.65 port 37892 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:16:55.423721 sshd-session[10054]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:55.427516 systemd-logind[2754]: New session 12 of user core. May 27 18:16:55.447699 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 18:16:55.776784 sshd[10056]: Connection closed by 139.178.89.65 port 37892 May 27 18:16:55.777239 sshd-session[10054]: pam_unix(sshd:session): session closed for user core May 27 18:16:55.780211 systemd[1]: sshd@9-147.28.228.207:22-139.178.89.65:37892.service: Deactivated successfully. May 27 18:16:55.781828 systemd[1]: session-12.scope: Deactivated successfully. May 27 18:16:55.782414 systemd-logind[2754]: Session 12 logged out. Waiting for processes to exit. May 27 18:16:55.783268 systemd-logind[2754]: Removed session 12. 
May 27 18:16:56.745024 kubelet[4358]: E0527 18:16:56.744983 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:16:59.745814 kubelet[4358]: E0527 18:16:59.745757 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:17:00.857494 systemd[1]: Started sshd@10-147.28.228.207:22-139.178.89.65:37894.service - OpenSSH per-connection server daemon (139.178.89.65:37894). May 27 18:17:01.279545 sshd[10093]: Accepted publickey for core from 139.178.89.65 port 37894 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:01.280797 sshd-session[10093]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:01.283973 systemd-logind[2754]: New session 13 of user core. May 27 18:17:01.304778 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 18:17:01.632039 sshd[10095]: Connection closed by 139.178.89.65 port 37894 May 27 18:17:01.632415 sshd-session[10093]: pam_unix(sshd:session): session closed for user core May 27 18:17:01.635459 systemd[1]: sshd@10-147.28.228.207:22-139.178.89.65:37894.service: Deactivated successfully. May 27 18:17:01.638080 systemd[1]: session-13.scope: Deactivated successfully. May 27 18:17:01.638653 systemd-logind[2754]: Session 13 logged out. Waiting for processes to exit. 
May 27 18:17:01.639447 systemd-logind[2754]: Removed session 13. May 27 18:17:01.715319 systemd[1]: Started sshd@11-147.28.228.207:22-139.178.89.65:37908.service - OpenSSH per-connection server daemon (139.178.89.65:37908). May 27 18:17:02.131702 sshd[10131]: Accepted publickey for core from 139.178.89.65 port 37908 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:02.132990 sshd-session[10131]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:02.136239 systemd-logind[2754]: New session 14 of user core. May 27 18:17:02.157697 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 18:17:02.506495 sshd[10133]: Connection closed by 139.178.89.65 port 37908 May 27 18:17:02.506833 sshd-session[10131]: pam_unix(sshd:session): session closed for user core May 27 18:17:02.509887 systemd[1]: sshd@11-147.28.228.207:22-139.178.89.65:37908.service: Deactivated successfully. May 27 18:17:02.511521 systemd[1]: session-14.scope: Deactivated successfully. May 27 18:17:02.512127 systemd-logind[2754]: Session 14 logged out. Waiting for processes to exit. May 27 18:17:02.512928 systemd-logind[2754]: Removed session 14. May 27 18:17:02.592374 systemd[1]: Started sshd@12-147.28.228.207:22-139.178.89.65:37918.service - OpenSSH per-connection server daemon (139.178.89.65:37918). May 27 18:17:03.009447 sshd[10169]: Accepted publickey for core from 139.178.89.65 port 37918 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:03.010749 sshd-session[10169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:03.014031 systemd-logind[2754]: New session 15 of user core. May 27 18:17:03.029747 systemd[1]: Started session-15.scope - Session 15 of User core. 
May 27 18:17:03.357471 sshd[10171]: Connection closed by 139.178.89.65 port 37918 May 27 18:17:03.357854 sshd-session[10169]: pam_unix(sshd:session): session closed for user core May 27 18:17:03.360879 systemd[1]: sshd@12-147.28.228.207:22-139.178.89.65:37918.service: Deactivated successfully. May 27 18:17:03.362545 systemd[1]: session-15.scope: Deactivated successfully. May 27 18:17:03.363153 systemd-logind[2754]: Session 15 logged out. Waiting for processes to exit. May 27 18:17:03.363975 systemd-logind[2754]: Removed session 15. May 27 18:17:04.306601 containerd[2775]: time="2025-05-27T18:17:04.306562523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"2dd0eeb116b2e77960ccf91f8ee6089a2cf35c1a7f230704a0ff5b3c35d13706\" pid:10234 exited_at:{seconds:1748369824 nanos:306383843}" May 27 18:17:07.745571 kubelet[4358]: E0527 18:17:07.745533 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:17:08.433461 systemd[1]: Started sshd@13-147.28.228.207:22-139.178.89.65:45304.service - OpenSSH per-connection server daemon (139.178.89.65:45304). May 27 18:17:08.845619 sshd[10258]: Accepted publickey for core from 139.178.89.65 port 45304 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:08.846822 sshd-session[10258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:08.849928 systemd-logind[2754]: New session 16 of user core. May 27 18:17:08.865688 systemd[1]: Started session-16.scope - Session 16 of User core. 
May 27 18:17:09.192660 sshd[10261]: Connection closed by 139.178.89.65 port 45304 May 27 18:17:09.192964 sshd-session[10258]: pam_unix(sshd:session): session closed for user core May 27 18:17:09.195958 systemd[1]: sshd@13-147.28.228.207:22-139.178.89.65:45304.service: Deactivated successfully. May 27 18:17:09.198199 systemd[1]: session-16.scope: Deactivated successfully. May 27 18:17:09.198787 systemd-logind[2754]: Session 16 logged out. Waiting for processes to exit. May 27 18:17:09.199632 systemd-logind[2754]: Removed session 16. May 27 18:17:10.745068 kubelet[4358]: E0527 18:17:10.745007 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:17:12.739772 containerd[2775]: time="2025-05-27T18:17:12.739719495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"11d92a23ab99ab8f40b3eca5bffb6bf53f335ef1251c95a4457d57a2cb4e20ec\" pid:10306 exited_at:{seconds:1748369832 nanos:739405694}" May 27 18:17:14.271357 systemd[1]: Started sshd@14-147.28.228.207:22-139.178.89.65:49116.service - OpenSSH per-connection server daemon (139.178.89.65:49116). May 27 18:17:14.693086 sshd[10330]: Accepted publickey for core from 139.178.89.65 port 49116 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:14.694258 sshd-session[10330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:14.697380 systemd-logind[2754]: New session 17 of user core. May 27 18:17:14.720685 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 27 18:17:15.047757 sshd[10333]: Connection closed by 139.178.89.65 port 49116 May 27 18:17:15.048060 sshd-session[10330]: pam_unix(sshd:session): session closed for user core May 27 18:17:15.051100 systemd[1]: sshd@14-147.28.228.207:22-139.178.89.65:49116.service: Deactivated successfully. May 27 18:17:15.052754 systemd[1]: session-17.scope: Deactivated successfully. May 27 18:17:15.053338 systemd-logind[2754]: Session 17 logged out. Waiting for processes to exit. May 27 18:17:15.054169 systemd-logind[2754]: Removed session 17. May 27 18:17:18.745475 kubelet[4358]: E0527 18:17:18.745428 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:17:20.130431 systemd[1]: Started sshd@15-147.28.228.207:22-139.178.89.65:49118.service - OpenSSH per-connection server daemon (139.178.89.65:49118). May 27 18:17:20.541808 sshd[10371]: Accepted publickey for core from 139.178.89.65 port 49118 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:20.543067 sshd-session[10371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:20.546161 systemd-logind[2754]: New session 18 of user core. May 27 18:17:20.571753 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 18:17:20.891064 sshd[10373]: Connection closed by 139.178.89.65 port 49118 May 27 18:17:20.891429 sshd-session[10371]: pam_unix(sshd:session): session closed for user core May 27 18:17:20.894484 systemd[1]: sshd@15-147.28.228.207:22-139.178.89.65:49118.service: Deactivated successfully. May 27 18:17:20.896725 systemd[1]: session-18.scope: Deactivated successfully. May 27 18:17:20.897319 systemd-logind[2754]: Session 18 logged out. Waiting for processes to exit. 
May 27 18:17:20.898145 systemd-logind[2754]: Removed session 18. May 27 18:17:20.973509 systemd[1]: Started sshd@16-147.28.228.207:22-139.178.89.65:49134.service - OpenSSH per-connection server daemon (139.178.89.65:49134). May 27 18:17:21.390037 sshd[10408]: Accepted publickey for core from 139.178.89.65 port 49134 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:21.391358 sshd-session[10408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:21.394759 systemd-logind[2754]: New session 19 of user core. May 27 18:17:21.418711 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 18:17:21.771967 sshd[10410]: Connection closed by 139.178.89.65 port 49134 May 27 18:17:21.772247 sshd-session[10408]: pam_unix(sshd:session): session closed for user core May 27 18:17:21.775247 systemd[1]: sshd@16-147.28.228.207:22-139.178.89.65:49134.service: Deactivated successfully. May 27 18:17:21.776907 systemd[1]: session-19.scope: Deactivated successfully. May 27 18:17:21.777463 systemd-logind[2754]: Session 19 logged out. Waiting for processes to exit. May 27 18:17:21.778295 systemd-logind[2754]: Removed session 19. May 27 18:17:21.845350 systemd[1]: Started sshd@17-147.28.228.207:22-139.178.89.65:49136.service - OpenSSH per-connection server daemon (139.178.89.65:49136). May 27 18:17:22.259128 sshd[10443]: Accepted publickey for core from 139.178.89.65 port 49136 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:22.260402 sshd-session[10443]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:22.263712 systemd-logind[2754]: New session 20 of user core. May 27 18:17:22.291704 systemd[1]: Started session-20.scope - Session 20 of User core. 
May 27 18:17:23.924257 sshd[10445]: Connection closed by 139.178.89.65 port 49136 May 27 18:17:23.924641 sshd-session[10443]: pam_unix(sshd:session): session closed for user core May 27 18:17:23.927661 systemd[1]: sshd@17-147.28.228.207:22-139.178.89.65:49136.service: Deactivated successfully. May 27 18:17:23.930090 systemd[1]: session-20.scope: Deactivated successfully. May 27 18:17:23.930352 systemd[1]: session-20.scope: Consumed 4.011s CPU time, 138.5M memory peak. May 27 18:17:23.930791 systemd-logind[2754]: Session 20 logged out. Waiting for processes to exit. May 27 18:17:23.931632 systemd-logind[2754]: Removed session 20. May 27 18:17:24.001209 systemd[1]: Started sshd@18-147.28.228.207:22-139.178.89.65:42832.service - OpenSSH per-connection server daemon (139.178.89.65:42832). May 27 18:17:24.416484 sshd[10541]: Accepted publickey for core from 139.178.89.65 port 42832 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:24.417735 sshd-session[10541]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:24.420908 systemd-logind[2754]: New session 21 of user core. May 27 18:17:24.439739 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 18:17:24.851776 sshd[10543]: Connection closed by 139.178.89.65 port 42832 May 27 18:17:24.852139 sshd-session[10541]: pam_unix(sshd:session): session closed for user core May 27 18:17:24.855153 systemd[1]: sshd@18-147.28.228.207:22-139.178.89.65:42832.service: Deactivated successfully. May 27 18:17:24.856788 systemd[1]: session-21.scope: Deactivated successfully. May 27 18:17:24.857380 systemd-logind[2754]: Session 21 logged out. Waiting for processes to exit. May 27 18:17:24.858173 systemd-logind[2754]: Removed session 21. May 27 18:17:24.938260 systemd[1]: Started sshd@19-147.28.228.207:22-139.178.89.65:42836.service - OpenSSH per-connection server daemon (139.178.89.65:42836). 
May 27 18:17:25.364189 sshd[10592]: Accepted publickey for core from 139.178.89.65 port 42836 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:25.365321 sshd-session[10592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:25.368467 systemd-logind[2754]: New session 22 of user core. May 27 18:17:25.391766 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 18:17:25.729873 sshd[10594]: Connection closed by 139.178.89.65 port 42836 May 27 18:17:25.730198 sshd-session[10592]: pam_unix(sshd:session): session closed for user core May 27 18:17:25.733187 systemd[1]: sshd@19-147.28.228.207:22-139.178.89.65:42836.service: Deactivated successfully. May 27 18:17:25.734822 systemd[1]: session-22.scope: Deactivated successfully. May 27 18:17:25.735391 systemd-logind[2754]: Session 22 logged out. Waiting for processes to exit. May 27 18:17:25.736182 systemd-logind[2754]: Removed session 22. May 27 18:17:25.745934 kubelet[4358]: E0527 18:17:25.745900 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:17:29.745509 kubelet[4358]: E0527 18:17:29.745468 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07" May 27 18:17:30.816361 systemd[1]: Started sshd@20-147.28.228.207:22-139.178.89.65:42840.service - OpenSSH per-connection server daemon 
(139.178.89.65:42840). May 27 18:17:31.240534 sshd[10637]: Accepted publickey for core from 139.178.89.65 port 42840 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:31.241803 sshd-session[10637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:31.245087 systemd-logind[2754]: New session 23 of user core. May 27 18:17:31.257696 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 18:17:31.593425 sshd[10639]: Connection closed by 139.178.89.65 port 42840 May 27 18:17:31.593741 sshd-session[10637]: pam_unix(sshd:session): session closed for user core May 27 18:17:31.596656 systemd[1]: sshd@20-147.28.228.207:22-139.178.89.65:42840.service: Deactivated successfully. May 27 18:17:31.598958 systemd[1]: session-23.scope: Deactivated successfully. May 27 18:17:31.599625 systemd-logind[2754]: Session 23 logged out. Waiting for processes to exit. May 27 18:17:31.600487 systemd-logind[2754]: Removed session 23. May 27 18:17:32.156748 containerd[2775]: time="2025-05-27T18:17:32.156713585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"f1ec1a7b759a9a79d83f671b68aa874989235f342c9397a5bd6f39f308a6a66f\" pid:10678 exited_at:{seconds:1748369852 nanos:156541265}" May 27 18:17:34.306566 containerd[2775]: time="2025-05-27T18:17:34.306530634Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7ce9e9a0c401aa8f082790f574d8eacd4ceae143950432409abce8799993c016\" id:\"951d67eb51cf0fda4aff7ddb3b0dfed2648327258c7e3be68a2b836e161b55aa\" pid:10700 exited_at:{seconds:1748369854 nanos:306376194}" May 27 18:17:36.665532 systemd[1]: Started sshd@21-147.28.228.207:22-139.178.89.65:53850.service - OpenSSH per-connection server daemon (139.178.89.65:53850). 
May 27 18:17:36.744958 kubelet[4358]: E0527 18:17:36.744918 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5d46c699d6-jdjnd" podUID="7b02d917-929c-457a-9f01-27a3d9745228" May 27 18:17:37.079409 sshd[10723]: Accepted publickey for core from 139.178.89.65 port 53850 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:37.080654 sshd-session[10723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:37.083922 systemd-logind[2754]: New session 24 of user core. May 27 18:17:37.104742 systemd[1]: Started session-24.scope - Session 24 of User core. May 27 18:17:37.424485 sshd[10726]: Connection closed by 139.178.89.65 port 53850 May 27 18:17:37.424782 sshd-session[10723]: pam_unix(sshd:session): session closed for user core May 27 18:17:37.427870 systemd[1]: sshd@21-147.28.228.207:22-139.178.89.65:53850.service: Deactivated successfully. May 27 18:17:37.430224 systemd[1]: session-24.scope: Deactivated successfully. May 27 18:17:37.430836 systemd-logind[2754]: Session 24 logged out. Waiting for processes to exit. May 27 18:17:37.431705 systemd-logind[2754]: Removed session 24. May 27 18:17:42.505588 systemd[1]: Started sshd@22-147.28.228.207:22-139.178.89.65:53852.service - OpenSSH per-connection server daemon (139.178.89.65:53852). 
May 27 18:17:42.745803 containerd[2775]: time="2025-05-27T18:17:42.745704784Z" level=info msg="TaskExit event in podsandbox handler container_id:\"79f2af845ec8738c1555fbd703d878bc9a69bad1a41625904e40a073604e11b8\" id:\"6eea766b24b043c020a3245f1c3e4feb197fe5d4a6093ea2598a56a664e6bf54\" pid:10775 exited_at:{seconds:1748369862 nanos:745440383}" May 27 18:17:42.942731 sshd[10761]: Accepted publickey for core from 139.178.89.65 port 53852 ssh2: RSA SHA256:nSFcPeSP4MLZWYvMyRcHhiEEDDwXzMlIp/vLEga1u/g May 27 18:17:42.944040 sshd-session[10761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:42.947410 systemd-logind[2754]: New session 25 of user core. May 27 18:17:42.965742 systemd[1]: Started session-25.scope - Session 25 of User core. May 27 18:17:43.292808 sshd[10800]: Connection closed by 139.178.89.65 port 53852 May 27 18:17:43.293094 sshd-session[10761]: pam_unix(sshd:session): session closed for user core May 27 18:17:43.296314 systemd[1]: sshd@22-147.28.228.207:22-139.178.89.65:53852.service: Deactivated successfully. May 27 18:17:43.298026 systemd[1]: session-25.scope: Deactivated successfully. May 27 18:17:43.298661 systemd-logind[2754]: Session 25 logged out. Waiting for processes to exit. May 27 18:17:43.299510 systemd-logind[2754]: Removed session 25. May 27 18:17:43.746056 kubelet[4358]: E0527 18:17:43.745983 4358 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-qrwrj" podUID="e42660bd-427d-4774-a8fa-836372ea3d07"